Chinese semiconductor industry


latenlazy

Brigadier
there's also the [link]. At some point simple die shrinks or even new transistor designs may not be enough to deal with the problems of thermal stress and leakage current. There will need to be new paradigms such as 3D packaging, [link], [link], etc.
Heck, even transitioning to FinFETs was a bit of a last-minute save for the industry (and lots of fabs were not able to make that jump). It’s kind of remarkable all the little tricks they’re figuring out to keep field-effect transistors relevant at smaller and smaller feature sizes, but at some point I think they’re going to run out of things they can pull from that magic hat.

Personally, my money is on photonics making the leap into logic processors in the next ten years. I’ve been watching people in the silicon photonics space slowly edging their way into designing photonics-based logic circuits. If that’s the next leap, though, it will change everything about the factors of competition and break open the field for those who couldn't get into the node-shrink game.
 

olalavn

Senior Member
Registered Member
Is it really that big of a disadvantage to fall behind in advanced nodes? Last month I built a new PC for my mom: a 12th gen i7 (10nm ESF) to replace her 10-year-old 3rd gen i5 (22nm). The only reason for the new PC was that my mom got a new 62-megapixel camera, and she hates having FastRAWViewer in her workflow, so a faster processor was needed to preview RAWs more quickly in DxO PhotoLab. However, the 62-megapixel sensor of the new camera already out-resolves most full-frame optics, so there's no reason ever to upgrade to a higher-megapixel camera body (a bigger-format camera is physically unusable for my mom), and no reason ever to get a better processor for post-processing.

If Huawei can come up with fully domestic 7nm chips (I know it's years away) I'll switch to Huawei phones. I see zero reason for a better processor on my phone now that a phone's power consumption is dominated by screen backlighting. I'll drag my father to the Huawei camp too. He's in the market for a new phone and his biggest requirement is a good microphone to use with karaoke apps.

For business/government computers, my impression is that most of the trendy applications for processing power are pretty parallelizable, and China is so far ahead of everyone else in renewable energy that efficiency for non-mobile applications is desirable but not necessary (except crypto mining, which is banned in China anyway).

I used to think the capacity to cut a country off from the newest and greatest GPUs and trigger a nerd revolt was a very dangerous power in the hands of the US government, but gaming GPUs were practically unavailable for two years and no government was overthrown because of that, so now I don't know.
Maybe; high-end Kirin will be back at the end of 2024.
 

FairAndUnbiased

Brigadier
Registered Member
Heck, even transitioning to FinFETs was a bit of a last-minute save for the industry (and lots of fabs were not able to make that jump). It’s kind of remarkable all the little tricks they’re figuring out to keep field-effect transistors relevant at smaller and smaller feature sizes, but at some point I think they’re going to run out of things they can pull from that magic hat.

Personally, my money is on photonics making the leap into logic processors in the next ten years. I’ve been watching people in the silicon photonics space slowly edging their way into designing photonics-based logic circuits. If that’s the next leap, though, it will change everything about the factors of competition and break open the field for those who couldn't get into the node-shrink game.
So, I don't know much about photonics. You can definitely transmit information optically, and you can translate it to electronics via photodiodes so you can couple with electronics-only devices like memory. But how does optical logic interact with electronic logic?

That's why I still think it'll be electronics, just with different materials and new designs. There is still a ton of performance to squeeze out of electronics.
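For a sense of scale on that photodiode hand-off, here's a rough back-of-the-envelope sketch. All numbers are illustrative assumptions, not figures for any real device: a photodiode converts received optical power into photocurrent, and a transimpedance amplifier turns that current into a voltage ordinary CMOS logic can sample.

```python
# Optical -> electrical hand-off, as mentioned above: photodiode responsivity
# turns optical power into current, a transimpedance amplifier (TIA) turns that
# current into a logic-level voltage. All component values are assumptions.

def photodiode_current(optical_power_w: float, responsivity_a_per_w: float = 0.8) -> float:
    """Photocurrent (A) = optical power (W) * responsivity (A/W)."""
    return optical_power_w * responsivity_a_per_w

def tia_output_voltage(photocurrent_a: float, transimpedance_ohm: float = 5e3) -> float:
    """TIA output swing (V) = photocurrent (A) * transimpedance gain (ohms)."""
    return photocurrent_a * transimpedance_ohm

if __name__ == "__main__":
    p_opt = 200e-6                    # 200 uW of light on the photodiode (assumed)
    i_pd = photodiode_current(p_opt)  # -> 160 uA of photocurrent
    v_out = tia_output_voltage(i_pd)  # -> 0.8 V, enough to drive ordinary CMOS logic
    print(f"photocurrent = {i_pd * 1e6:.0f} uA, TIA output = {v_out:.2f} V")
```

So the conversion itself is routine; the question is whether you want to pay for it at every boundary between optical and electronic logic.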
 

latenlazy

Brigadier
So, I don't know much about photonics. You can definitely transmit information optically, and you can translate it to electronics via photodiodes so you can couple with electronics-only devices like memory. But how does optical logic interact with electronic logic?

That's why I still think it'll be electronics, just with different materials and new designs. There is still a ton of performance to squeeze out of electronics.
The idea is that you do the logic computations optically and then convert the output back to electrical signals. The bet is that a photonics circuit lets you do far more computation, more quickly and more efficiently, than what you lose to the signal-conversion penalty. I guess the other direction you could go is with materials like graphene, but either way I feel like we are reaching the end of silicon’s capacity to shrink down feature sizes very soon.
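A toy model of that trade-off (the speedup factor, conversion penalty, and throughput below are all made-up assumptions, just to show the shape of the argument): the photonic block only wins once the amount of computation per conversion is large enough to amortize the boundary cost.

```python
# Toy latency model: a photonic core is assumed to run the compute N times
# faster but pays a fixed electrical->optical->electrical conversion penalty.
# Every number here is an assumption chosen only to illustrate the break-even.

def electronic_time(ops: float, ops_per_ns: float = 1.0) -> float:
    """Time (ns) for an all-electronic block to run `ops` operations."""
    return ops / ops_per_ns

def photonic_time(ops: float, speedup: float = 20.0,
                  conversion_penalty_ns: float = 50.0,
                  ops_per_ns: float = 1.0) -> float:
    """Time (ns) for a photonic block: faster compute plus conversion overhead."""
    return ops / (ops_per_ns * speedup) + conversion_penalty_ns

if __name__ == "__main__":
    for ops in (10, 100, 1_000, 10_000):
        e, p = electronic_time(ops), photonic_time(ops)
        winner = "photonic" if p < e else "electronic"
        print(f"{ops:>6} ops: electronic {e:>8.1f} ns, photonic {p:>8.1f} ns -> {winner}")
```

With these made-up numbers the crossover sits somewhere below 100 operations per conversion; the real argument is just that the work done optically has to be big enough to swamp the boundary penalty.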
 

hvpc

Junior Member
Registered Member
The idea is that you do the logic computations optically and then convert the output back to electrical signals. The bet is that a photonics circuit lets you do far more computation, more quickly and more efficiently, than what you lose to the signal-conversion penalty. I guess the other direction you could go is with materials like graphene, but either way I feel like we are reaching the end of silicon’s capacity to shrink down feature sizes very soon.
Like the SCAMP-5 and SCAMP-7 chips that @tonyget shared a few hours ago?
 