Chinese semiconductor industry


antiterror13

Brigadier
That is what these ignorant western analysts do not understand: the Chinese army does not need an EUV machine; they are quite independent in the manufacturing of integrated circuits. I have heard that CETC can manufacture down to 65 nm, although in low volume. The chips used by the military do not need to be manufactured on the latest nodes, because reliability is the ultimate goal. Worse still, they ignore two technological advances that China is seriously investing in and that could give it a considerable military advantage. One is the broad field of advanced packaging, like 3D stacking and heterogeneous integration, which would allow them to create new IC architectures with controllable physical characteristics without having to make a large investment. The other area is new materials like silicon carbide and gallium nitride. Integrated circuits made of silicon carbide can literally survive hell. None of that requires EUV or even immersion lithography.
In the end, the only thing they have achieved is to unite Chinese companies in an effort to create an independent semiconductor industry, without significantly hurting the electronic capabilities of the Chinese army. It is blowback after blowback.

Great, thank you.

So do CETC fabs use 100% Chinese equipment and software?

My other question is: what lithography node is used for China's latest AESA radars, which I assume are all manufactured 100% in China?
 

FairAndUnbiased

Brigadier
Registered Member
I have read a lot of claims that the military doesn't require state-of-the-art, frontier chips. Having said that, with the advancing intelligentization of the military, how would they achieve it with low-performance chips?

Here's proof.
Please, Log in or Register to view URLs content!
which is
Please, Log in or Register to view URLs content!


As for requirements, they emphasize reliability over, say, the speed and bandwidth that civilians need. They typically use very sparse datalinks that send the minimum information possible (text, data, voice or low-resolution video), while civilian communications require high bandwidth due to the transmission of high-definition video and interactive data like gaming.

In addition, civilian communications have to handle millions of channels, and latency is a huge deal. A civilian will flip out if they have 200 ms of lag in DOTA.
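To put rough numbers on that bandwidth gap, here is a back-of-the-envelope sketch in Python. All figures (a Link 16-class datalink of roughly 115 kbit/s, a 4K stream of roughly 20 Mbit/s, the track-update sizes and rates) are illustrative assumptions of mine, not data from this thread.

```python
# Back-of-the-envelope bandwidth comparison (all figures are assumptions, not thread data):
# a sparse tactical datalink feed vs. a single civilian 4K video stream.

TACTICAL_DATALINK_KBPS = 115      # assumed Link 16-class throughput, order of magnitude only
VOICE_CHANNEL_KBPS = 16           # assumed compressed voice channel
TRACK_UPDATE_BYTES = 200          # assumed size of one position/track report
UPDATES_PER_SECOND = 10           # assumed track update rate

CIVILIAN_4K_STREAM_MBPS = 20      # assumed typical 4K streaming bitrate

track_feed_kbps = TRACK_UPDATE_BYTES * 8 * UPDATES_PER_SECOND / 1000
print(f"Track updates alone : {track_feed_kbps:.0f} kbit/s")
print(f"Tracks + voice      : {track_feed_kbps + VOICE_CHANNEL_KBPS:.0f} kbit/s")
print(f"One 4K stream needs ~{CIVILIAN_4K_STREAM_MBPS * 1000 / TACTICAL_DATALINK_KBPS:.0f}x "
      f"the whole datalink's capacity")
```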
 

caudaceus

Senior Member
Registered Member
Here's proof.
Please, Log in or Register to view URLs content!
which is
Please, Log in or Register to view URLs content!


As for requirements, they emphasize reliability over, say, the speed and bandwidth that civilians need. They typically use very sparse datalinks that send the minimum information possible (text, data, voice or low-resolution video), while civilian communications require high bandwidth due to the transmission of high-definition video and interactive data like gaming.

In addition, civilian communications have to handle millions of channels, and latency is a huge deal. A civilian will flip out if they have 200 ms of lag in DOTA.
So how will the military handle computerization and intelligentization then? I don't think they can still use 150 nm chips for that.
 

FairAndUnbiased

Brigadier
Registered Member
So how will the military handle computerization and intelligentization then? I don't think they can still use 150 nm chips for that.
Why can't they? You just need to split functions up and accept a larger package size. For instance, a mobile phone has to squeeze the CPU, GPU and display driver into one SoC because splitting the functions up would increase package size and power consumption.

Please, Log in or Register to view URLs content!
being fed predetermined geometric shapes and alphanumeric data that don't require, for instance, real-time 4K video. You can accept higher power consumption.
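To illustrate how much lighter that workload is, here is a rough Python sketch comparing a sparse symbology feed with raw real-time 4K video. The symbol counts, sizes and frame rates are assumptions of mine, chosen only to show the orders-of-magnitude gap.

```python
# Illustrative workload comparison (all figures assumed): drawing sparse symbology on a
# cockpit-style display vs. handling raw real-time 4K video. The point is the
# orders-of-magnitude gap in data processed per second, not any specific aircraft.

SYMBOLS_PER_FRAME = 500     # assumed count of geometric shapes / alphanumeric labels on screen
BYTES_PER_SYMBOL = 32       # assumed encoding: type, position, orientation, label
SYMBOLOGY_FPS = 30          # assumed refresh rate of the symbology feed

symbology_bytes_per_s = SYMBOLS_PER_FRAME * BYTES_PER_SYMBOL * SYMBOLOGY_FPS

W, H, FPS, BYTES_PER_PIXEL = 3840, 2160, 60, 3   # raw (uncompressed) 4K video at 60 fps
video_bytes_per_s = W * H * FPS * BYTES_PER_PIXEL

print(f"Symbology feed : {symbology_bytes_per_s / 1e6:.2f} MB/s")
print(f"Raw 4K video   : {video_bytes_per_s / 1e6:.0f} MB/s")
print(f"Ratio          : ~{video_bytes_per_s / symbology_bytes_per_s:,.0f}x")
```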
 

BlackWindMnt

Captain
Registered Member
So how will the military handle computerization and intelligentization then? I don't think they can still use 150 nm chips for that.
Maybe distribute the computation: weapons only need to be smart enough to hit the target at the last moment; for every moment before that, you could use data centers on a ship, in space or in the air to help guide the weapon close enough to the target.

I wouldn't be surprised if SpaceX is doing that with their low Earth orbit Starlink satellite array, putting enough computational power in space to guide weapons.
 

Weaasel

Senior Member
Registered Member
If that's true, then China could be the first country to have end-to-end, fully onshore semiconductor manufacturing. That's very impressive.

I really hope that Chinese society as a whole learns the lesson here: the importance of appreciating "hard tech" and its ecosystem, supply chain, etc., and national awareness of whatever "tech war" may happen in the future.
Please fund real estate less and synbio more.
China should aim for that, because under US pressure it remains vulnerable to having supplies of necessities cut off by foreign countries; many of those countries wouldn't want to cut China off, but they are bullied by the United States into doing so.
 

tokenanalyst

Brigadier
Registered Member
I have read a lot of claims that the military doesn't require state-of-the-art, frontier chips. Having said that, with the advancing intelligentization of the military, how would they achieve it with low-performance chips?
Reliability is more important for the military than the performance of the latest semiconductor nodes, where reliability is not guaranteed. Your fancy 8 nm RTX 3080 Ti GPU can crash in a game, but the GPU that drives the screen of a stealth jet cannot be allowed to fail in a critical mission; the auto industry is another great example. Even for the U.S. military's weapon systems, their main semiconductor contractor, and the only U.S.-owned pure-play foundry, is SkyWater Technology, and its most advanced node is 65 nm. What is their focus? Rad-hard, 3D SoC, advanced packaging, novel architectures.

Please, Log in or Register to view URLs content!


Performance is a tricky word in a world where Moore's law is ending. What makes one IC perform better than another? It is more complex than just the tech node.

The architecture? For example, a well-designed novel AI architecture does not necessarily need the latest transistor node to outperform.

The technology node? In recent years, smaller technology nodes have not translated into faster computers in terms of GHz. The performance of a single 5 nm CPU will not be that far from the performance of a single 45 nm CPU with the same transistor count; the latter will consume a lot more energy, but they will perform almost the same.

Transistor count? Yes, you can get more transistors with smaller nodes, but you can also get them with advanced packaging such as chiplets or 3D stacking, for example, if power consumption is not a problem, as in the case of AMD's Threadripper.

Energy consumption? THIS is the biggest driver for smaller technology nodes, and it affects the civilian world more than the military world: electric energy is costly and consumers demand the best performance per watt. The smartphone industry has been the strongest pusher for smaller tech nodes because of its requirements; unlike the auto industry, it does not care about the most stringent reliability.
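A minimal numerical sketch of that energy point, using the textbook dynamic-power relation P ≈ αCV²f. The node parameters below (supply voltage, relative switched capacitance, activity factor) are rough assumptions for illustration, not measured values for any real 45 nm or 5 nm chip.

```python
# Minimal sketch of the energy point above, using the textbook dynamic-power relation
# P ~ alpha * C * V^2 * f. The node parameters (supply voltage, relative switched
# capacitance, activity factor) are rough assumptions, not measured values for real chips.

def dynamic_power(cap_rel, v_dd, freq_ghz, activity=0.2):
    """Relative dynamic power of a core with relative switched capacitance cap_rel."""
    return activity * cap_rel * (v_dd ** 2) * freq_ghz

# Same architecture, same transistor count, same 3 GHz clock on both nodes:
p_old = dynamic_power(cap_rel=1.0, v_dd=1.10, freq_ghz=3.0)   # assumed 45 nm-class figures
p_new = dynamic_power(cap_rel=0.15, v_dd=0.75, freq_ghz=3.0)  # assumed 5 nm-class figures

print(f"45 nm core, relative power: {p_old:.2f}")
print(f" 5 nm core, relative power: {p_new:.2f}")
print(f"Older node burns ~{p_old / p_new:.0f}x the power for roughly the same work per clock")
```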
 

Tam

Brigadier
Registered Member
Great, thank you.

So do CETC fabs use 100% Chinese equipment and software?

My other question is: what lithography node is used for China's latest AESA radars, which I assume are all manufactured 100% in China?
They started with 90 nm for the DSPs, then 55 nm, then 40 nm. That was years ago. I presume the AESAs on the 052D and the 055 might be using the 90 nm based DSP. From 2018, they shifted to 40 nm for the next-generation DSP and are moving to 28 nm. In 2012, Huarui 1 was introduced by Institute 14 to be used on over 12 Institute 14 (NRIET) radars, and Soul Core 1 by Institute 38 for acoustic applications such as sonar. In 2018, Huarui 2 was introduced on a 40 nm process. Huarui 3 will use 28 nm and features AI.
 

BoraTas

Captain
Registered Member
Reliability is more important for the military than the performance of the latest semiconductor nodes, where reliability is not guaranteed. Your fancy 8 nm RTX 3080 Ti GPU can crash in a game, but the GPU that drives the screen of a stealth jet cannot be allowed to fail in a critical mission; the auto industry is another great example. Even for the U.S. military's weapon systems, their main semiconductor contractor, and the only U.S.-owned pure-play foundry, is SkyWater Technology, and its most advanced node is 65 nm. What is their focus? Rad-hard, 3D SoC, advanced packaging, novel architectures.

Please, Log in or Register to view URLs content!


Performance is a tricky word in a world where Moore's law is ending. What makes one IC perform better than another? It is more complex than just the tech node.

The architecture? For example, a well-designed novel AI architecture does not necessarily need the latest transistor node to outperform.

The technology node? In recent years, smaller technology nodes have not translated into faster computers in terms of GHz. The performance of a single 5 nm CPU will not be that far from the performance of a single 45 nm CPU with the same transistor count; the latter will consume a lot more energy, but they will perform almost the same.

Transistor count? Yes, you can get more transistors with smaller nodes, but you can also get them with advanced packaging such as chiplets or 3D stacking, for example, if power consumption is not a problem, as in the case of AMD's Threadripper.

Energy consumption? THIS is the biggest driver for smaller technology nodes, and it affects the civilian world more than the military world: electric energy is costly and consumers demand the best performance per watt. The smartphone industry has been the strongest pusher for smaller tech nodes because of its requirements; unlike the auto industry, it does not care about the most stringent reliability.
Another factor: since 2011, the increase in transistor counts has mostly resulted in just bigger caches and more cores. Before that, especially until the mid-00s, more transistors really were enabling more advanced computer architectures. Coupled with clock speed increases, 50-60% performance increases per year were common. I think anything at 28 nm or below is great. China will be independent at 14 nm in 2-4 years. I don't think its national power will be hurt by the lack of the most advanced electronics.
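A quick compounding check on those growth figures, as a Python sketch. The 55%/year rate is taken from the 50-60% claim above; the ~10%/year figure for the post-2011 era is my own assumed contrast.

```python
# Quick compounding check on the growth figures above. The 55%/year rate comes from the
# 50-60% claim in the post; the 10%/year rate for the post-2011 era is an assumed contrast.

def cumulative_gain(rate_per_year, years):
    """Total performance multiple after compounding yearly gains."""
    return (1 + rate_per_year) ** years

for label, rate in [("~55%/yr (mid-90s to mid-00s)", 0.55),
                    ("~10%/yr (post-2011, assumed)", 0.10)]:
    print(f"{label}: {cumulative_gain(rate, 10):.1f}x over 10 years")
# Roughly 80x per decade at 55%/yr vs. about 2.6x per decade at 10%/yr.
```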
 

Tam

Brigadier
Registered Member
They started with 90 nm for the DSPs, then 55 nm, then 40 nm. That was years ago. I presume the AESAs on the 052D and the 055 might be using the 90 nm based DSP. From 2018, they shifted to 40 nm for the next-generation DSP and are moving to 28 nm. In 2012, Huarui 1 was introduced by Institute 14 to be used on over 12 Institute 14 (NRIET) radars, and Soul Core 1 by Institute 38 for acoustic applications such as sonar. In 2018, Huarui 2 was introduced on a 40 nm process. Huarui 3 will use 28 nm and features AI.

Correction.

2012 - 55 nm for the Soul Core 1 DSP. I presume the same for the Huarui 1 DSP, not 90 nm. Huarui 1 features a quad-core design.
40 nm for the Huarui 2 DSP starting in 2018.
28 nm for the Huarui 3 DSP soon, if not already.

Please, Log in or Register to view URLs content!
Please, Log in or Register to view URLs content!
Please, Log in or Register to view URLs content!

J-20 AESA radar uses 3D stacking. Sez so here.

[Attached image]
 