...
Seems like you are all talk with zero real-world experience... unlike you I've been building my own PC gaming rigs since childhood and playing PC games since I can remember, and I've bought basically every major Nvidia GPU release that has ever come out... I put my money where my mouth is, I live and breathe this stuff on the ground, and anyone who really knows anything will tell you that Moore's Law is dead... I'm not saying there aren't incremental improvements, but maybe you don't understand the definition of Moore's Law. Moore's Law, as defined, died a long time ago; the mistake is yours.
The reality on the ground is that there is barely any real inventory for ANY of the RTX 3000 series right now, even though they started launching last summer... and it doesn't look like the inventory issue will be resolved this year... just because you won't buy from scalpers doesn't mean availability isn't an issue. It all goes to show that theoretical progress on paper means nothing if it cannot be brought to market in volume and on time. Not to mention the GPUs are now getting ridiculously large, drawing ridiculous amounts of power, producing ridiculous amounts of heat, and all of this hides the fact that real progress and real performance gains have ground to a halt...
There was a period of time in which progress did track Moore's Law, but trying to tell me that the slowdown I'm obviously seeing isn't real flies in the face of reality.
Nice joke, kid. My first computer was a Commodore 64 and I assemble my own PCs from components.
You seem to either know so little about the computing market that you get ripped off, or have enough money that you don't care how you spend it. I am a thrifty sort of person, so I would rather wait and ride out the market.
Memory prices in particular can be really volatile as production ramps up. They typically follow roughly two-year cycles.
GPUs have had high prices ever since the coin mining craze came up. Yet if you know enough about that market, you can make some guesses. Smaller GPU dies like Navi are cheaper to produce. When Bitmain and company made ASIC mining of Bitcoin possible, demand for GPUs dropped like a rock. That is basically when I bought my GPU card. The coin miner craze only came back to GPUs recently with the Ethereum mining craze. Initially Ethereum didn't even have a GPU-accelerated miner. But that is only going to be an issue until the ASICs for Ethereum become widespread. Right now the ASICs don't provide enough of a performance boost, since the Ethereum algorithm is deliberately memory-hard and therefore complicated to accelerate with dedicated hardware. But it isn't impossible.
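If you want a feel for why a memory-hard algorithm is awkward for ASICs, here is a toy sketch. To be clear, this is NOT the real Ethash spec; the dataset size, constants, and mixing step are made up. The point is only the shape of the loop: every hash forces pseudo-random reads from a big dataset, so the bottleneck is memory bandwidth rather than custom logic.

```python
# Toy illustration (NOT the real Ethash algorithm) of a memory-hard
# proof-of-work loop: each step does a pseudo-random read into a large
# dataset, so throughput is bound by memory bandwidth, not raw compute.
import hashlib
import os
import struct

DATASET_WORDS = 1 << 22                    # ~16 MiB of 32-bit words (the real DAG is GBs)
dataset = bytearray(os.urandom(DATASET_WORDS * 4))

def toy_pow(header: bytes, nonce: int, rounds: int = 64) -> bytes:
    """Hash header+nonce, then mix in `rounds` random dataset reads."""
    seed = hashlib.sha256(header + struct.pack("<Q", nonce)).digest()
    acc = int.from_bytes(seed, "little")
    for _ in range(rounds):
        idx = (acc % DATASET_WORDS) * 4                # pseudo-random index into the dataset
        word = int.from_bytes(dataset[idx:idx + 4], "little")
        acc = (acc * 0x01000193 + word) % (1 << 256)   # cheap, made-up mixing step
    return acc.to_bytes(32, "little")

print(toy_pow(b"block header", nonce=12345).hex())
```

An ASIC can make the mixing step essentially free, but it still has to stream the dataset out of memory, which is why the gains are modest until someone builds enough memory bandwidth into the chip.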
NVIDIA's GPU sucks for several reasons. The die is huge. You bought a graphics card with a chip that is 628.4 mm². That means the yields on it are going to be awful, availability will be low, and prices high. The GPU chip I bought was 251 mm². Then there is the fact that NVIDIA switched to Samsung 8nm while AMD uses TSMC 7nm. Samsung's process is presently less advanced than TSMC's. Plus NVIDIA's designers were not used to Samsung's process. It is well known in the industry that NVIDIA goofed up with this one.
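To put some numbers on the yield argument, here is a back-of-the-envelope sketch. The defect density is an assumed figure and the Poisson yield model is the simplest textbook approximation, not foundry data.

```python
# Back-of-the-envelope comparison of a 628.4 mm^2 die vs a 251 mm^2 die on
# a 300 mm wafer. The defect density is an ASSUMED number and the Poisson
# yield model is a textbook approximation, not foundry data.
import math

WAFER_DIAMETER_MM = 300.0
DEFECTS_PER_CM2 = 0.1                      # assumed defect density

def dies_per_wafer(die_area_mm2: float) -> int:
    """Standard gross-die estimate for a round wafer."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float) -> float:
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-(die_area_mm2 / 100.0) * DEFECTS_PER_CM2)   # mm^2 -> cm^2

for area in (628.4, 251.0):
    gross = dies_per_wafer(area)
    y = poisson_yield(area)
    print(f"{area:6.1f} mm^2: ~{gross} candidates/wafer, "
          f"~{y:.0%} yield, ~{int(gross * y)} good dies/wafer")
```

With those assumptions the big die gets far fewer candidates per wafer and a noticeably worse yield on each of them, which is exactly why availability suffers and prices climb.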
To be honest, with the market's current craze, if I wanted to build a PC right now I would stick with an old card or use integrated graphics. But that is me. I seldom play the latest AAA titles.
Some people have had success buying 2nd-hand cards from coin miners over AliExpress for peanuts, flashing the firmware, and outputting over integrated graphics. A coin miner card basically doesn't have video outputs, yet you can typically pass the video through to the integrated graphics with some work. Those guys dump their older cards once they get new ones, and they typically sell them for scrap prices. But it is a bit of a gamble. I don't do crap like that.
Just look at the industry. TSMC plans to fab 55,000 wafers per month at 3nm in 2022 and 105,000 wafers per month in 2023. So I would expect the GPU market to normalize in 2 years.
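Just as a rough scale check on those numbers: the current big gaming GPUs are not fabbed on 3nm, and the dies-per-wafer and yield figures below are assumptions, so treat this purely as order-of-magnitude arithmetic.

```python
# Very rough scale check on those wafer numbers. The big gaming GPUs are
# not fabbed on 3nm, and the dies-per-wafer and yield figures below are
# assumptions, so this is only about order of magnitude.
wafers_per_month = {2022: 55_000, 2023: 105_000}
good_dies_per_wafer = 90 * 0.5             # assume ~90 candidates for a ~600 mm^2 die, ~50% yield

for year, wafers in wafers_per_month.items():
    dies = wafers * good_dies_per_wafer
    print(f"{year}: roughly {dies / 1e6:.1f} million good dies per month")
```

Even with pessimistic assumptions that is millions of big chips a month once capacity ramps, which is why I expect supply to catch up.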
Moore's Law merely denotes an exponential trend, but the speed of advancement, that is, the time between each iteration, is not constant. At times advancement has been faster, and at other times it has been slower. Around the time processors hit 1 GHz, advancement was really quick. But that was not always the case. Like I said, there is a roadmap for advancing lithography for the next decade. Who knows what will happen after that.
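To make the distinction concrete: the law as stated is just exponential growth, and the doubling period is the knob that has stretched over time. A trivial sketch with made-up baseline and periods:

```python
# Moore's Law is just "transistor count doubles every N years"; the value
# of N is what has stretched over time. Baseline and periods are made up.
def transistors(start: float, years: float, doubling_years: float) -> float:
    return start * 2 ** (years / doubling_years)

baseline = 1e9                             # assumed 1-billion-transistor chip as a starting point
for doubling in (1.5, 2.0, 3.0):           # hypothetical doubling periods in years
    after_10 = transistors(baseline, 10, doubling)
    print(f"doubling every {doubling:.1f} yr -> {after_10 / 1e9:5.0f} B transistors in 10 yr")
```

A longer doubling period still compounds into huge gains over a decade; it is slower exponential growth, not the end of the trend.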