antiterror13
Brigadier
Moore's Law died a long time ago... back in the golden age of desktop computing, I remember as a kid that every year a new computer at Best Buy was twice as fast as the previous one... Nowadays? I upgraded to an Intel 9900K last year, after running a 4790K for over six years, and saw basically zero improvement...
So the CPU is dead, but what about the GPU, where all the buzz is? Well, even here it is slowing down... never mind that there is no inventory and Nvidia is doing paper launches. Back in 2016 I got the top of the line, state of the art GPU at the time, the GTX 980, for about $599 at Fry's... today I got the top of the line RTX 3090, which has a much higher MSRP of $1,499, but as you know there is almost zero inventory, so I had to buy it on Craigslist from scalpers at an inflated cost of almost $3,000 (right now it's almost $3,500), so basically it's gone roughly 5x up in price... not to mention it takes up ALL THREE PCIe slots in my machine, uses TWICE the power, and produces TWICE or more the HEAT, all for marginal improvements... So it costs way more, uses way more energy, produces way more waste/heat, and takes up way more space...
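Just to put rough numbers on that, here's a quick back-of-the-envelope sketch using the prices I quoted above; the TDP figures are approximate board-power specs, so treat them as ballpark:

```python
# Rough comparison using the figures from the post above.
# Prices are what I paid / MSRP; TDPs are approximate board-power specs.
gtx980_price = 599        # GTX 980, roughly what it cost at Fry's
gtx980_tdp_w = 165        # approximate board power

rtx3090_msrp = 1499       # RTX 3090 MSRP
rtx3090_street = 3000     # roughly what scalpers charge
rtx3090_tdp_w = 350       # approximate board power

print(f"MSRP multiple:   {rtx3090_msrp / gtx980_price:.1f}x")    # ~2.5x
print(f"Street multiple: {rtx3090_street / gtx980_price:.1f}x")  # ~5.0x
print(f"Power multiple:  {rtx3090_tdp_w / gtx980_tdp_w:.1f}x")   # ~2.1x
```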
Then it's not even truly 4K. Even with the RTX 3090 I cannot play Cyberpunk in 4K with RTX on, because it requires a "cheat" in the form of DLSS, which is a fancy term for using Nvidia "AI" to smartly upscale a lower-resolution image... so basically 4K is not even true 4K, it's 1080p upscaled to 4K using "AI"...
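To make the "cheat" concrete, here is a quick pixel-count sketch; the internal render resolutions are the commonly cited DLSS Performance/Quality targets at 4K output, not something I've measured in Cyberpunk itself:

```python
# Pixel-count sketch: how much of a "4K" frame is actually rendered
# before the AI upscaler fills in the rest. Internal resolutions are
# the commonly cited DLSS modes at 4K output, not measured values.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)

modes = {
    "DLSS Performance (1080p internal)": pixels(1920, 1080),
    "DLSS Quality (1440p internal)":     pixels(2560, 1440),
}

for name, rendered in modes.items():
    print(f"{name}: {rendered / native_4k:.0%} of the 4K pixels actually rendered")
# Performance: 25% of the 4K pixels; Quality: ~44%
```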
So don't tell me Moore's Law isn't dead, I saw it get stepped on the neck with my own eyes... Plus you cannot get smaller than one atom anyway; that is the hard barrier, not even counting quantum tunneling and other physical limits. Saying it "never happened" is like those Peak Oil deniers who say "we will never run out of oil", not realizing that in any finite environment everything will peak.
Moore's Law is not about the size of the lithography... it's that the number of transistors doubles every 1.5 to 2 years. You could still do that at the same lithography node... 3D chips are one of the options.
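In other words, the actual statement is just an exponential doubling of transistor count, regardless of whether it comes from smaller features, bigger dies, or stacked layers. A minimal sketch of that growth (the baseline count here is a made-up example, not a specific chip):

```python
# Moore's Law as stated: transistor count doubles roughly every 2 years,
# whether that comes from shrinking features, bigger dies, or 3D stacking.
def transistors(n0, years, doubling_period=2.0):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

# Hypothetical example: a 2-billion-transistor chip projected forward 10 years.
print(f"{transistors(2e9, 10):.2e}")  # ~6.4e10, i.e. ~32x in a decade
```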