Chinese semiconductor industry


gelgoog

Lieutenant General
Registered Member
Yeah, China is basically forced to develop the entire fab supply chain, which, if achieved, would bring China to the level of Korea and Japan, the two countries closest to having the entire ecosystem. The US controls everything indirectly, from the Netherlands to Germany, South Korea to Japan, and Taiwan, but it doesn't have it all under its own roof. If China can manage this within five or so years, even to a degree that isn't market competitive, it would put it in an impressive and near-peerless league (save for the US, which we could consider the "owner" of the other five primary suppliers, each with its own niche).

Why do you think they are incapable of doing so, when all that seems to be left is EUV lithography (with the other steps either mastered, nearly mastered, or obsolete but still usable) and EDA tools? That's two things to close, within the domain of diminishing-Moore's-law silicon semiconductors, within five years or so.

If I understand this field correctly, China's own ecosystem is far from market leading, but it is surely second behind the group of five suppliers (the Netherlands, Japan, Korea, Taiwan, and Germany) held by the US puppet master. There is no third place. Niche players exist in the UK of course, but when it comes to fab equipment I don't think so. Russia and India are around the same level, well behind even this, but they do have access to the entire set of supply-chain equipment from which China is now totally banned. What China has that isn't covered by the bans is market leading. Without the latest EUV lithography machines and EDA tools, China cannot be market competitive and can only depend on stockpiled supplies of chips, because it cannot manufacture below 28nm without at least those two things.

No single country controls the entire ecosystem. The US is in a position where it can apply leverage on the entire market, but it risks losing it all. Also, do not expect Moore's law to be dead or whatever; people have been claiming that for decades. It never happened.

There has been a roadmap for semiconductors for the next decade at least, and it has been that way for as long as anyone can remember. The relevant companies are working on next-generation EUV tools and are making good progress. No one knows exactly what will happen in two decades, but who cares.

You can put South Korea and Taiwan more or less in the same bag: they manufacture chips, but they do not control their supplies or tools. Japan, the Netherlands, and the US have some tool vendors, component manufacturers, and materials suppliers. Germany has some vanishingly small players on the market: Siemens owns Mentor Graphics, but the software development isn't done in Germany, and Zeiss manufactures optics for lithography tools.

Russia has faced limitations on the types of semiconductor manufacturing tools it can purchase since the time of the Soviet Union. Typically it could never purchase tools closer than two to three generations behind the current state of the art. This was done on purpose to keep Western semiconductors ahead of Soviet semiconductors.
Try reading about the "Wassenaar Arrangement" or "COCOM".

Remember in 2015 when Obama (and VP Joe Biden) banned Intel chips for Chinese supercomputers? Didn't China develop its own indigenous chips for its supercomputers, which then ranked among the top 3 or top 5 in the world?

So this is consistent with Obama-era officials: bans on supercomputing-related tech. This is no surprise, actually. It's still a significant level below the Trump-era bans on actual Chinese tech (TikTok, WeChat, Huawei, ZTE, etc.).

It's a largely symbolic action, since China already LEADS the US in total supercomputers, so Biden isn't even really targeting anything important.

China did develop its own supercomputer chips, and that supercomputer, the Sunway TaihuLight, became #1 in the world (2016-2018).
That is what the US just sanctioned: the companies which designed the chips used in the Chinese supercomputers.

Namely Sunway, which designed the CPU used in the TaihuLight, and Phytium, which designed the FeiTeng CPU used in the Tianhe-1A.

Phytium went into the commercial market and later developed ARM-based server chips. Unlike the chips they developed for the supercomputers, these were manufactured at TSMC.
 

voyager1

Captain
Registered Member
Also, do not expect Moore's law to be dead or whatever; people have been claiming that for decades. It never happened.
Why not? The physical limit for silicon semiconductors is around 2nm, because that's where quantum effects like tunneling kick in and everything goes bad.
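A rough back-of-the-envelope sketch of the tunneling issue (assuming a simple rectangular barrier and an illustrative 1 eV barrier height; this is not a device model, just the exponential that drives the problem):

import math

# Tunneling transmission through a rectangular barrier: T ~ exp(-2*kappa*d),
# with kappa = sqrt(2*m*(V - E)) / hbar. Illustrative numbers only.
hbar = 1.054e-34             # reduced Planck constant, J*s
m_e = 9.109e-31              # electron mass, kg
barrier_j = 1.0 * 1.602e-19  # assumed 1 eV barrier height, in joules

kappa = math.sqrt(2 * m_e * barrier_j) / hbar  # ~5.1e9 per metre

for d_nm in (5, 2, 1, 0.5):
    t = math.exp(-2 * kappa * d_nm * 1e-9)
    print(f"{d_nm} nm barrier -> transmission ~ {t:.1e}")

Each nanometre shaved off the barrier raises leakage by several orders of magnitude, which is the "quantum stuff" in question.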

So now they are trying other techniques and architectures, such as 3D stacking, and new materials, which would continue the performance gains and power savings.

However, on the new-materials side everyone is, in theory, starting from the same position (in practice I would say the West is leading by 2-3 years).

So whereas previously China was always lagging ten years behind on semiconductors, new materials could allow it to leapfrog the competition.
 

quantumlight

Junior Member
Registered Member
The US is in a position where it can apply leverage on the entire market, but it risks losing it all. Also, do not expect Moore's law to be dead or whatever; people have been claiming that for decades. It never happened.

Moore's law died a long time ago... Back in the golden age of desktop computing, I remember as a kid that every year the new computer at Best Buy was twice as fast as the previous one. Nowadays? I upgraded to an Intel 9900K last year, after running a 4790K for over six years, and saw basically zero improvement...

So the CPU is dead; what about the GPU, where all the buzz is? Well, even here it is slowing down... never mind that there is no inventory and Nvidia is doing paper launches. Back in 2016 I got the top-of-the-line, state-of-the-art GPU at the time, the GTX 980, for about $599 at Fry's. Today I got the top-of-the-line RTX 3090, which has a much higher MSRP of $1499, but as you know there is almost zero inventory, so I had to buy it on Craigslist from scalpers at an inflated cost of almost $3000 (right now it's almost $3500). So basically it's gotten 5X more expensive... not to mention it takes up ALL THREE PCI slots in my machine, uses TWICE the power, and produces TWICE or more the HEAT, all for marginal improvements. So it costs way more, uses way more energy, produces way more waste heat, and takes up way more space...
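Taking the post's own figures at face value, the "5X" claim is straightforward arithmetic (a sketch; the prices are the post's, not verified):

# Price ratios using the figures quoted above.
gtx980_price = 599      # USD, the post's GTX 980 purchase price
rtx3090_msrp = 1499     # USD, the post's quoted RTX 3090 MSRP
rtx3090_street = 3000   # USD, the post's scalper price

print(f"MSRP ratio:   {rtx3090_msrp / gtx980_price:.1f}x")    # ~2.5x
print(f"street ratio: {rtx3090_street / gtx980_price:.1f}x")  # ~5.0x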

Then it's not even truly 4K. Even with the RTX 3090 I cannot play Cyberpunk in 4K with RTX, because it requires a "cheat" in the form of DLSS, which is a fancy way of saying Nvidia's "AI" smartly upscales a lower-resolution image... so basically the 4K is not even true 4K; it's 1080p upscaled to 4K using "AI"...
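For what the upscaling "cheat" amounts to in pixel counts (a sketch using the 1080p-to-4K example above; actual DLSS internal resolutions vary by quality mode):

# Fraction of output pixels that are actually shaded when upscaling 1080p to 4K.
rendered = 1920 * 1080   # internal render resolution (the post's example)
displayed = 3840 * 2160  # 4K output resolution

print(f"shaded fraction: {rendered / displayed:.0%}")  # 25% of the output pixels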

So don't tell me Moore's Law isn't dead; I watched it get stepped on the neck with my own eyes... Plus you cannot get smaller than one atom anyway; that is the hard barrier, not counting quantum tunneling and other physical limits. So "never happened" is like those Peak Oil deniers who say "we will never run out of oil," not realizing that in any finite environment everything peaks.
 

gelgoog

Lieutenant General
Registered Member
Moore's law died a long time ago... Back in the golden age of desktop computing, I remember as a kid that every year the new computer at Best Buy was twice as fast as the previous one. Nowadays? I upgraded to an Intel 9900K last year, after running a 4790K for over six years, and saw basically zero improvement...

So the CPU is dead; what about the GPU, where all the buzz is? Well, even here it is slowing down... never mind that there is no inventory and Nvidia is doing paper launches. Back in 2016 I got the top-of-the-line, state-of-the-art GPU at the time, the GTX 980, for about $599 at Fry's. Today I got the top-of-the-line RTX 3090, which has a much higher MSRP of $1499, but as you know there is almost zero inventory, so I had to buy it on Craigslist from scalpers at an inflated cost of almost $3000 (right now it's almost $3500). So basically it's gotten 5X more expensive... not to mention it takes up ALL THREE PCI slots in my machine, uses TWICE the power, and produces TWICE or more the HEAT, all for marginal improvements. So it costs way more, uses way more energy, produces way more waste heat, and takes up way more space...

Then it's not even truly 4K. Even with the RTX 3090 I cannot play Cyberpunk in 4K with RTX, because it requires a "cheat" in the form of DLSS, which is a fancy way of saying Nvidia's "AI" smartly upscales a lower-resolution image... so basically the 4K is not even true 4K; it's 1080p upscaled to 4K using "AI"...

So don't tell me Moore's Law isn't dead; I watched it get stepped on the neck with my own eyes... Plus you cannot get smaller than one atom anyway; that is the hard barrier, not counting quantum tunneling and other physical limits. So "never happened" is like those Peak Oil deniers who say "we will never run out of oil," not realizing that in any finite environment everything peaks.

That's where you made a mistake. You should have bought an AMD processor. Intel has sucked ever since Zen came out.

GPUs cost more today for several reasons, one being lack of competition in the market. I bought an AMD card a couple of years back when Navi came out, and it was expensive, but not that expensive. Games are typically optimized to run on console-level graphics, which are usually several generations behind desktop PCs anyway, so it makes little difference for gaming. NVIDIA charges a premium, and today they use Samsung manufacturing processes, which are worse than TSMC's. NVIDIA is having a bit of a conflict with TSMC, because TSMC gives priority allocation to Apple. The market is also choked with GPU miners and people stuck at home buying up cards.

Sometimes chip designs plain suck. Intel Itanium sucked and it wasn't because of semiconductor process technology. The Voodoo 5 sucked and it wasn't because of semiconductor process technology.

Cyberpunk is a piss-poorly optimized piece of software. They can't even get the bugs out, let alone make it run fast.

I would never buy a GPU from a scalper. I would just wait another year until prices get back to normal. I bought an AMD Zen CPU in 2017 when it came out, for peanuts. Back then memory and GPUs were horridly expensive, so I just shelved the CPU and waited until prices came down in 2019, bought the components, and assembled the computer. I paid retail price.

We will worry about hitting the atomic limit when it happens. I still remember people a couple of years back saying hard disks would never increase in capacity because of the superparamagnetic limit, and that was bullshit too. Sure, the capacity doesn't increase as quickly, but it still increases.
 

redswift

Just Hatched
Registered Member
Moore's Law is dead, as progress has slowed considerably since the 1990s. After launching Sandy Bridge in 2011, Intel basically stood still and just rehashed its existing technology with minimal improvements. This was a mistake, as it allowed AMD to finally catch up with the Ryzen processors.

We have reached the point of diminishing returns, which will make it much easier for China to catch up.
 

voyager1

Captain
Registered Member
OK, here are the sanctions on the supercomputers.

The sanctions could mean some Chinese companies may no longer be able to outsource the manufacture of custom computer chips to overseas foundries such as Samsung in South Korea and TSMC in Taiwan, according to Xiao Limin, an adviser to the government on supercomputer technology.
So no advanced chip manufacturing. IMO, in contrast to the government spin, that's big.

Because the more advanced the chip, the bigger the power savings, and thus the smaller the electricity bill and the better the efficiency.
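As a rough illustration of why that matters at supercomputer scale (all numbers hypothetical, chosen only to show the order of magnitude):

# Hypothetical: annual electricity cost of a large machine, and the saving
# from running the same workload on a more power-efficient node.
power_kw = 15_000          # assumed 15 MW system draw
usd_per_kwh = 0.10         # assumed electricity price
hours_per_year = 24 * 365
efficiency_gain = 0.30     # assumed 30% less power on an advanced node

annual_cost = power_kw * hours_per_year * usd_per_kwh
print(f"annual power bill: ${annual_cost / 1e6:.1f}M")                    # ~$13.1M
print(f"saved per year:    ${annual_cost * efficiency_gain / 1e6:.1f}M")  # ~$3.9M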

So now the rival supercomputer companies will have a competitive advantage (mostly on the commercial side).

Let the Chinese government say what it wants, but this is another humiliation for the Chinese nation.

Let the Chinese people be humiliated if that is necessary, so they can start kicking out these stupid CEOs and managers who use international IP.
 

manqiangrexue

Brigadier
Let the Chinese government say what it wants, but this is another humiliation for the Chinese nation.

Let the Chinese people be humiliated if that is necessary, so they can start kicking out these stupid CEOs and managers who use international IP.
Why's everything such a big humiliation or a big problem with you? This is normal for competition of this scale; if the US sanctioned off its IP to try to prevent you from kicking their asses with it, then you make your own and you make it better. It's a sign that China's doing it right, or the US wouldn't be concerned and wouldn't be inflicting harm on its own companies with export bans to try to stop China. That's how it always goes. Does the US need to watch China maul American tech with American IP for things to be normal? To beat your enemy, you're going to enter a phase where he's using his cards against you, then another phase where he's desperate and using double-edged cards against you, and then the final phase, when he's out of cards and accepts defeat; the enemy that worries me the most is the one that just looks at you and smiles, seemingly doing nothing. I don't know why it's such a big humiliation that your enemy is making moves to stop you from beating him with its own tools. Get over it and get over the dramatic rhetoric.
 

voyager1

Captain
Registered Member
Why's everything such a big humiliation or a big problem with you? This is normal for competition of this scale; if the US sanctioned off its IP to try to prevent you from kicking their asses with it, then you make your own and you make it better. That's how it always goes. Does the US need to watch China maul American tech with American IP for things to be normal? I don't know why it's such a big humiliation that your enemy is making moves to stop you from beating him with its own tools. Get over it and get over the dramatic rhetoric.
But I didn't criticise the US, did I? I actually think the US was too late to sanction them. If I were the US, I would have started restricting China from buying chips long ago (at the same time Trump started his sanctions).

The reason I am saying it is a humiliation for China is that even after all this trade war, and now the tech war, these companies continued using US IP, and they still worship foreign technology.

So if you were a Chinese person, hypothetically speaking, wouldn't you feel humiliated if your country's companies were still choosing foreign tech instead of developing their own?

I tell you what: if I were Chinese, I would hang my head in shame at my fellow Chinese betraying the country every day by choosing to use US IP tech. IMO, the state media should run with this story day in, day out, to make the people understand what is happening and turn them against these treasonous companies.
 

manqiangrexue

Brigadier
But I didn't criticise the US, did I? I actually think the US was too late to sanction them. If I were the US, I would have started restricting China from buying chips long ago (at the same time Trump started his sanctions).
Obama started it, and it didn't go well for America. Here they are repeating the same experiment hoping for a different result.
The reason I am saying it is a humiliation for China is that even after all this trade war, and now the tech war, these companies continued using US IP, and they still worship foreign technology.
This is not "all this trade and tech war." It has only started. It's very early, and China's tech system was created to be globally integrated. Replacing every US component and every piece of IP takes time. You're looking at the beginning of a tech war expecting the end result, and feeling humiliated that you're not finding it.
So if you were a Chinese person, hypothetically speaking, wouldn't you feel humiliated if your country's companies were still choosing foreign tech instead of developing their own?

I tell you what: if I were Chinese, I would hang my head in shame at my fellow Chinese betraying the country every day by choosing to use US IP tech. IMO, the state media should run with this story day in, day out, to make the people understand what is happening and turn them against these treasonous companies.
Are they choosing US/foreign tech despite the availability of Chinese options because they just think foreign is better, or are they using foreign tech because it is the only viable option in the short term while China develops its own alternatives? There's a big difference. The former is humiliating, but the latter is a survival strategy.
 

quantumlight

Junior Member
Registered Member
That's where you made a mistake. You should have bought an AMD processor. Intel has sucked ever since Zen came out.

GPUs cost more today for several reasons, one being lack of competition in the market. I bought an AMD card a couple of years back when Navi came out, and it was expensive, but not that expensive. Games are typically optimized to run on console-level graphics, which are usually several generations behind desktop PCs anyway, so it makes little difference for gaming. NVIDIA charges a premium, and today they use Samsung manufacturing processes, which are worse than TSMC's. NVIDIA is having a bit of a conflict with TSMC, because TSMC gives priority allocation to Apple. The market is also choked with GPU miners and people stuck at home buying up cards.

Sometimes chip designs plain suck. Intel Itanium sucked and it wasn't because of semiconductor process technology. The Voodoo 5 sucked and it wasn't because of semiconductor process technology.

Cyberpunk is a piss-poorly optimized piece of software. They can't even get the bugs out, let alone make it run fast.

I would never buy a GPU from a scalper. I would just wait another year until prices get back to normal. I bought an AMD Zen CPU in 2017 when it came out, for peanuts. Back then memory and GPUs were horridly expensive, so I just shelved the CPU and waited until prices came down in 2019, bought the components, and assembled the computer. I paid retail price.

We will worry about hitting the atomic limit when it happens. I still remember people a couple of years back saying hard disks would never increase in capacity because of the superparamagnetic limit, and that was bullshit too. Sure, the capacity doesn't increase as quickly, but it still increases.
I used Cyberpunk as one example, but the trend is the same no matter which current AAA PC title you compare... Cyberpunk is actually well optimized for high-end PCs; I'm not talking about consoles. It's even worse if you compare it to other recent titles like Microsoft Flight Simulator 2020: at 4K with ultra settings, most people cannot even get above 45fps in Flight Sim with the latest and greatest RTX 3090...

Seems like you are all talk with zero real-world experience... Unlike you, I've been building my own PC gaming rigs since childhood and playing PC games for as long as I can remember; I've bought basically every major release of every Nvidia GPU that has come out so far... I put my money where my mouth is, I live and breathe this stuff, and anyone who really knows anything will tell you that Moore's Law is dead... I'm not saying there aren't incremental improvements, but maybe you don't understand the definition of Moore's Law. Moore's Law, as defined, died a long time ago; the mistake is yours.
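For the record, the canonical definition is transistor count doubling roughly every two years. Over the six-year gap described above, that would predict 8x the transistors (a sketch using only the two-year doubling period; whether shipping desktop parts actually tracked it is exactly the dispute here):

# Expected growth factor under Moore's law: count doubles every ~2 years.
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

print(f"6-year gap -> {moores_law_factor(6):.0f}x the transistors")  # 8x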

The reality on the ground is that there is barely any real inventory for ANY of the RTX 3000 series right now, even though they started launching last summer... and it doesn't appear the inventory issue will be resolved this year. Just because you won't buy from a scalper doesn't mean availability isn't an issue. It all goes to show that theoretical progress on paper means nothing if it cannot be brought to market in volume and on time. Not to mention the GPUs are getting ridiculously large, drawing ridiculous amounts of power, and producing ridiculous amounts of heat, and all of this hides the fact that real performance gains have ground to a halt. Even the MSRP (which is itself fake; Nvidia knows it cannot pump out the FE cards at MSRP, so it's a fake number) has been skyrocketing for GPUs. Taking all these factors into consideration, I'd say not only is Moore's Law dead, we are actually moving backwards...

There was a period in which progress did track Moore's Law, but telling me that the slowdown I'm obviously seeing isn't real flies in the face of actual reality.
 