Chinese semiconductor industry

Status
Not open for further replies.

BlackWindMnt

Captain
Registered Member
Spot on. Graphene chips are mostly a sci-fi feature. One of the greatest problems is that you cannot stack multiple graphene layers; do that and it becomes plain graphite. We don't even have a single-atom-thick transistor design. Worse, we don't even know how to design one, because of quantum tunneling.
Graphene is the forever lab material, though I hear it is getting some R&D traction in battery applications.
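The tunneling point above can be put in rough numbers with the standard WKB approximation for a rectangular barrier. The 3 eV barrier height and the widths below are illustrative assumptions, not figures from the thread; the point is only how fast leakage grows as a barrier shrinks toward one atomic layer.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def tunneling_probability(barrier_ev: float, width_nm: float) -> float:
    """WKB estimate of electron transmission through a rectangular barrier."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# For an assumed ~3 eV barrier: negligible leakage at 1 nm,
# but substantial leakage at roughly one atomic layer (~0.3 nm).
for d in (1.0, 0.5, 0.3):
    print(f"{d} nm barrier: T ~ {tunneling_probability(3.0, d):.2e}")
```

The exponential dependence on width is the whole story: every halving of the barrier thickness squares-roots the suppression, which is why atom-scale transistor geometries run into tunneling leakage.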
 

jfcarli

Junior Member
Registered Member
Spot on. Graphene chips are mostly a sci-fi feature. One of the greatest problems is that you cannot stack multiple graphene layers; do that and it becomes plain graphite. We don't even have a single-atom-thick transistor design. Worse, we don't even know how to design one, because of quantum tunneling.
About 90% of China's chip consumption is at 14nm or larger nodes, and it still imports the vast majority of those chips.

If China concentrates on chips above 7nm, it will dramatically reduce the money it hands to Western foundries. It will keep that money within China, fund more research, improve overall supply-chain security, and so on.

It seems to me that research on sub-7nm nodes, EUV, and exotic materials should continue, but at a lower priority than guaranteeing supply at 7nm and above, which can be made with DUV, is cheaper, and lets China hurt Western suppliers by ceasing to buy those chips from them.

Localization and mass production at 7nm and above is the wise thing to do. This is where the war economy should be applied.

Sub-7nm is desirable but not a life-threatening problem.

China has plenty of time to catch up.

My 2 cents.
 

AndrewS

Brigadier
Registered Member
About 90% of China's chip consumption is at 14nm or larger nodes, and it still imports the vast majority of those chips.

If China concentrates on chips above 7nm, it will dramatically reduce the money it hands to Western foundries. It will keep that money within China, fund more research, improve overall supply-chain security, and so on.

It seems to me that research on sub-7nm nodes, EUV, and exotic materials should continue, but at a lower priority than guaranteeing supply at 7nm and above, which can be made with DUV, is cheaper, and lets China hurt Western suppliers by ceasing to buy those chips from them.

Localization and mass production at 7nm and above is the wise thing to do. This is where the war economy should be applied.

Sub-7nm is desirable but not a life-threatening problem.

China has plenty of time to catch up.

My 2 cents.

If you're talking about replacing all imports of DUV 7nm+ chips, then my guesstimate is 170+ fabs, as per the previous post, based on 50% of current global capacity.

And if you're talking about a war-economy deployment, you have to compress that into 2-3 years.
Ramping up that much will be a challenge, along with finding customers for all these fabs.

Personally, I think 5-10 years is feasible for mass deployment of, say, 200 fabs.
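The guesstimate above can be sketched as back-of-envelope arithmetic. Every input here is an assumption taken from or inferred from this thread (the ~340 global fab count is simply implied by "170 is 50% of global capacity"), not a sourced figure:

```python
import math

# Assumed inputs, for illustration only.
GLOBAL_MATURE_NODE_FABS = 340   # implied worldwide fab count at DUV-class nodes
TARGET_SHARE = 0.50             # thread's premise: replace ~50% of global capacity
RAMP_YEARS_WAR = 2.5            # compressed "war economy" timeline (2-3 years)
RAMP_YEARS_NORMAL = 7.5         # midpoint of the 5-10 year estimate

fabs_needed = math.ceil(GLOBAL_MATURE_NODE_FABS * TARGET_SHARE)
print(f"fabs needed: {fabs_needed}")                                  # 170
print(f"war-economy pace: {fabs_needed / RAMP_YEARS_WAR:.0f}/year")   # ~68/year
print(f"normal pace: {fabs_needed / RAMP_YEARS_NORMAL:.0f}/year")     # ~23/year
```

The difference between roughly 68 and 23 new fabs per year is the crux of the disagreement over whether a 2-3 year ramp is realistic.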
 

jfcarli

Junior Member
Registered Member
If you're talking about replacing all imports of DUV 7nm+ chips, then my guesstimate is 170+ fabs, as per the previous post, based on 50% of current global capacity.

And if you're talking about a war-economy deployment, you have to compress that into 2-3 years.
Ramping up that much will be a challenge, along with finding customers for all these fabs.

Personally, I think 5-10 years is feasible for mass deployment of, say, 200 fabs.
Agreed! But imports of chips at 7nm and above are an EXISTENTIAL THREAT to China. That is where the war economy should concentrate. The rest is desirable: it hurts, but does not kill.
 

Overbom

Brigadier
Registered Member
If you're talking about replacing all imports of DUV 7nm+ chips, then my guesstimate is 170+ fabs, as per the previous post, based on 50% of current global capacity.

And if you're talking about a war-economy deployment, you have to compress that into 2-3 years.
Ramping up that much will be a challenge, along with finding customers for all these fabs.

Personally, I think 5-10 years is feasible for mass deployment of, say, 200 fabs.
You also need highly trained IC engineers for production. AFAIK there is a 400,000-person talent gap in the IC sector. Not sure of the exact number, but I have seen it in some studies.
 

jfcarli

Junior Member
Registered Member
You also need highly trained IC engineers for production. AFAIK there is a 400,000-person talent gap in the IC sector. Not sure of the exact number, but I have seen it in some studies.
One more reason to get down to it as soon as possible. Provide on-the-job training for the newcomers.
 

antonius123

Junior Member
Registered Member
About 90% of China's chip consumption is at 14nm or larger nodes, and it still imports the vast majority of those chips.

If China concentrates on chips above 7nm, it will dramatically reduce the money it hands to Western foundries. It will keep that money within China, fund more research, improve overall supply-chain security, and so on.

It seems to me that research on sub-7nm nodes, EUV, and exotic materials should continue, but at a lower priority than guaranteeing supply at 7nm and above, which can be made with DUV, is cheaper, and lets China hurt Western suppliers by ceasing to buy those chips from them.

Localization and mass production at 7nm and above is the wise thing to do. This is where the war economy should be applied.

Sub-7nm is desirable but not a life-threatening problem.

China has plenty of time to catch up.

My 2 cents.

With its abundance of money and talent, China can concentrate on all of them simultaneously: 14nm, 7nm, 5nm and below.
 

jfcarli

Junior Member
Registered Member
The threat of a complete chip blockade is far from negligible. Here is an article just published in Foreign Affairs, a rather influential outlet.

It is a long article, but worth reading for all the crap it says. Here is what it says about semiconductors.

QUOTE

Washington also needs to do more to stymie Beijing’s plans to dominate semiconductor manufacturing. Chinese leaders are well aware that most twenty-first-century technologies—including 5G telecommunications, synthetic biology, and machine learning—are built around advanced semiconductors. Accordingly, those leaders have poured more than $100 billion in subsidies into building Chinese chip foundries, with mixed results.

Most of the world’s cutting-edge chips are produced by the Taiwan Semiconductor Manufacturing Company. The CCP has many ideological and strategic reasons to consider invading Taiwan; its quest for control of the market for chips represents an economic incentive to do so. Of course, a war could seriously damage Taiwan’s foundries, which, in any case, would struggle to maintain production without Western chip designs and equipment. And such a shock to chip supplies would affect millions of downstream jobs in China, not just those in other large economies. Even so, Beijing might believe that China could recover from a crisis more quickly than the United States. That is precisely the lesson Beijing drew from the COVID-19 pandemic, which has taken a far greater toll on China’s adversaries than on China itself. To be sure, Beijing would not take the fateful step of attacking Taiwan and risking war with the United States based on semiconductor inventories alone. The point is that Chinese leaders may not view the disruption of semiconductor supply chains as an inhibitor to launching a war.

Regardless of Beijing’s calculus, Washington should seek to eliminate any potential Chinese advantage in semiconductors by subsidizing new chip foundries in the United States—something the 2020 CHIPS Act and the 2021 U.S. Innovation and Competition Act seek to do. The U.S. Commerce Department must also slow Beijing’s efforts to scale up its foundries by applying sharper restrictions on the export of U.S.-made equipment used to manufacture semiconductors—not just for cutting-edge chips but also for those that are a couple of generations older.



UNQUOTE
 

krautmeister

Junior Member
Registered Member
Let's see:

No native bandgap, so you can't switch it electronically in its native state; it needs to be doped to get a bandgap
Spot on. Graphene chips are mostly a sci-fi feature. One of the greatest problems is that you cannot stack multiple graphene layers; do that and it becomes plain graphite. We don't even have a single-atom-thick transistor design. Worse, we don't even know how to design one, because of quantum tunneling.
Graphene isn't going to replace silicon, but graphene research led to work on other 2D materials, like indium selenide, which do have a bandgap. There was a recent post by @ansy1968 referring to a materials breakthrough by Yunnan University. There wasn't a lot of information in that video, but it sounded like they were able to build transistors on a new 2D material with a bandgap. If so, eventual logic circuits could be built with this material, and perhaps one day stacked layer upon layer thanks to its low power draw and good thermal performance. This is a really long way off, but it's not sci-fi like graphene. Even graphene can work in transistors by sandwiching graphene layers between something like boron nitride to create the bandgap barrier and then mounting the stack on silicon oxide to help control the transistors. Definitely not practical, but if that can be done with graphene, then other 2D materials with a native bandgap will work given enough time.
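The bandgap argument above reduces to one scaling law: a semiconductor's intrinsic (leakage) carrier density goes roughly as exp(-Eg / 2kT), so a zero-gap material like pristine graphene can never be fully switched off. A minimal sketch, with prefactors ignored and the ~1.3 eV indium selenide figure an assumed ballpark rather than a number from the thread:

```python
import math

KT_EV = 0.02585  # thermal energy kT at 300 K, in eV

def relative_intrinsic_carriers(eg_ev: float) -> float:
    """Intrinsic carrier density scaling, exp(-Eg / 2kT), prefactors ignored."""
    return math.exp(-eg_ev / (2 * KT_EV))

# Zero gap means the "off" state leaks as much as the "on" state conducts.
for name, eg in [("graphene", 0.0), ("Si (1.12 eV)", 1.12), ("InSe (~1.3 eV)", 1.3)]:
    print(f"{name}: relative leakage ~ {relative_intrinsic_carriers(eg):.1e}")
```

Silicon's ~1.1 eV gap suppresses thermal leakage by roughly ten orders of magnitude at room temperature; graphene's zero gap suppresses it not at all, which is why a bandgap must be engineered in before it can serve as a logic switch.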
 

ougoah

Brigadier
Registered Member
Graphene isn't going to replace silicon, but graphene research led to work on other 2D materials, like indium selenide, which do have a bandgap. There was a recent post by @ansy1968 referring to a materials breakthrough by Yunnan University. There wasn't a lot of information in that video, but it sounded like they were able to build transistors on a new 2D material with a bandgap. If so, eventual logic circuits could be built with this material, and perhaps one day stacked layer upon layer thanks to its low power draw and good thermal performance. This is a really long way off, but it's not sci-fi like graphene. Even graphene can work in transistors by sandwiching graphene layers between something like boron nitride to create the bandgap barrier and then mounting the stack on silicon oxide to help control the transistors. Definitely not practical, but if that can be done with graphene, then other 2D materials with a native bandgap will work given enough time.

The video mentions a method using platinum sulfide that has been more practical than graphene.
 