AI is kinda overrated. The Japanese government decided in the 1980s that they were going to skip regular CPUs and go into so-called fifth-generation computer hardware (i.e. AI).
It was a bust. It is a fad that comes and goes as new techniques become popular and people get inflated expectations out of them.
You have to learn to walk before you learn to run.
The South Koreans were a lot more focused. Samsung saw that consumer devices were going to be based more on electronics in the future and decided to invest in chip manufacturing to pursue that goal: to build actual devices people would use and buy.
Once you set abstract goalposts like "AI" as a vague buzzword, you seldom get anything useful, which is what happened to the Japanese. What you get is a quagmire. "AI" is used in several important applications, but it is a wide and diverse field, so claiming you are investing in AI is not saying much at all. It is used in things like quality control in factories with machine vision algorithms, text OCR, and speech recognition and synthesis. It is also now becoming more used in applications like automated driving, namely in obstacle detection, classification, avoidance, and path finding. So in that sense it is important today and in the future. All of these areas have been beaten to death in the robotics field, with progress made slowly and typically measured in decades, not years. IBM worked on speech recognition some five decades ago and had practical implementations, but it was considered an unnecessary luxury in most cases, so only much later did you see widespread use of it. Now every smartphone does it.
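Just to make one of those items concrete: "path finding with obstacle avoidance", at its simplest, is the kind of thing you can sketch in a few lines. Here is a toy breadth-first search over a hypothetical occupancy grid (purely illustrative; real driving stacks use far more elaborate planners and sensor pipelines):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.
    grid: list of rows, 0 = free cell, 1 = obstacle.
    Returns the list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []                  # reconstruct by walking predecessors
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and step not in came_from):
                came_from[step] = cell
                queue.append(step)
    return None                        # goal is unreachable

# Toy map: 0 = drivable, 1 = detected obstacle.
grid = [[0, 0, 0, 1],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(shortest_path(grid, (0, 0), (3, 3)))
```

The point is that the underlying algorithms are decades old and well understood; what changed is the sensing, the compute, and the willingness to deploy them.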
The US government, at the same time as the Japanese government, also made investments in AI. The only program from that entire decade of US government-funded research that can be called a success was when the US Army (not DARPA) invested in software to optimize military logistics. That was put into practical use, saved the US Army a considerable amount of resources, and was later used in other areas.
However, if you do not even have the basic means to design and manufacture chips, volatile and non-volatile computer memory, or build decent-quality automobiles, perhaps you should have other priorities than "AI". Preferably something you can measure and describe. Saying your goal is "AI" is kinda like declaring a "War on Drugs" or a "War on Terrorism". These are nice catchphrases, but how can you even define goals for something so nebulously phrased? Compare that objective with "we will put a man on the moon by the end of this decade". Now that is a well-defined political objective with a limited scope. It might be a waste of resources, but at least it was clearly defined, so you can track progress.
In most cases where people mention "AI", these are either software or hardware implementations of algorithms. But claiming it is a post-von Neumann architecture in the cases where people implement AI today, like on smartphone SoCs, is hogwash. There are proposals for alternative architectures, such as DARPA's research program on graph computing, which could have applications in AI and elsewhere. But that is not what you typically hear about. It could also be a bust in the long term, just like other DARPA programs in the past, like multi-flow architectures.
The Japanese were a bit early, but actually their goal is not that far off. Nowadays there are alternatives to the CPU, i.e. the sequential machine; notice the proliferation of different kinds of processors like GPUs, FPGAs, ASICs, and neural accelerators.
The von Neumann machine depends on sequential operation, and at some point it runs into a bottleneck. So far the industry has been able to overcome it by packing more transistors onto the chip, the so-called Moore's Law. But Moore's Law is coming to an abrupt end, since anything close to 5 nm causes interference and we are approaching that limit. Notice Intel's struggles to develop even 10 nm technology, while the other fab, GlobalFoundries, gave up on the leading edge completely.
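To make the sequential-versus-parallel point concrete, here is a toy Python/NumPy sketch (illustrative only; it has nothing to do with how any particular accelerator is actually built). The same dot product can be written as one instruction after another, or handed off as a single bulk operation that vectorised or parallel hardware can spread across many execution units at once, which is what GPUs, FPGAs and neural accelerators exploit:

```python
import numpy as np

a = np.random.rand(100_000)
b = np.random.rand(100_000)

# Sequential, von Neumann style: one fetch-decode-execute step per element,
# everything funnelled through a single instruction stream.
total = 0.0
for x, y in zip(a, b):
    total += x * y

# Bulk, data-parallel style: one high-level operation over the whole array,
# which vectorised or parallel hardware (SIMD units, GPUs, ASICs) can
# spread across many execution units at once.
total_vec = float(np.dot(a, b))

print(np.isclose(total, total_vec))  # same result, very different execution model
```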
Death of Moore’s Law
In 1965, Dr. Gordon E. Moore wrote an article based on a trend he noticed: the number of transistors in an integrated circuit (IC) doubles approximately every two years[1]. Fueled by unrelenting demands from more complex software, faster games and broadband video, Moore’s Law has held true for over 50 years.[2] It became the de facto roadmap against which the semiconductor industry drove its R&D and chip production, as facilitated by SEMATECH’s pre-competitive agreements among global semiconductor manufacturers. Recently, that roadmap has faltered due to physics limitations and the high cost-benefit economics incurred by the incredibly small scales that chip manufacturing has reached. Electron leakages and difficulties shaping matter at the single-digit nanometer scales of the transistors fundamentally limit further miniaturization. So many electrons are being moved through such tight spaces so quickly that there is an entire field in the semiconductor industry devoted just to chip cooling; without thermal controls, the ICs simply fry and fail. A new fabrication plant (fab) can cost more than $10 billion, severely limiting the number of companies able to produce denser ICs.
Despite the looming end of Moore’s Law, computationally-intensive artificial intelligence (AI) has exploded in capabilities in the last few years – but how, if compute is slowing down? The solution to exceeding compute limitations of traditional von Neumann style central processing units (CPUs)[3] has been to invent and leverage wholly new architectures not dependent on such linear designs[4].
A veritable zoo of compute architectures – including GPUs[5], ASICs[6], FPGAs[7], quantum computers, neuromorphic chips, nanomaterial-based chips, optical-based ICs, and even biochemical architectures - are being researched and/or implemented to better enable deep learning and other instantiations of AI. Here we review the latest / greatest[8] of non-CPU computer architectures relevant to AI. In each section, we describe the hardware, its impact to AI, and a selection of companies and teams active in its R&D and commercialization.
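As a quick back-of-the-envelope rendering of the doubling rule quoted above (a sketch only: the two-year doubling period and the 2,300-transistor Intel 4004 from 1971 are the usual illustrative starting values, not precise industry data):

```python
# Moore's Law as a rough formula: N(year) = N0 * 2 ** ((year - year0) / 2),
# i.e. the transistor count doubles roughly every two years.
def projected_transistors(year, base_year=1971, base_count=2300):
    # 2,300 is the commonly quoted transistor count of the Intel 4004 (1971).
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")
# The projection reaches tens of billions of transistors by the early 2020s,
# roughly where real flagship chips sit, and where the physical and economic
# limits described above start to bite.
```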
So yeah, the era of the von Neumann machine architecture is drawing to a close.
The US and the West have a commanding lead in the von Neumann architecture due to legacy, because the integrated circuit was invented in the US through the work of Kilby at Texas Instruments and Noyce at Fairchild, and Noyce later co-founded Intel in 1968.
Now, what was China doing in 1967? The Cultural Revolution was wrecking all the industrial and research institutions in China, and the universities did not reopen their doors until the 1980s.
So your comparison to South Korea is faulty. South Korea had an early start and uninterrupted industrial development since the 1960s, and she had complete access to American technology and research and development.
China, by contrast, was put on the technology embargo lists: first there was CoCom, then Wassenaar, and now ITAR.
Another thing: China was VERY late to the semiconductor business. She spent the first 20 years after the Cultural Revolution ended just building the basic industrial base, starting with pots and pans, clothing, and plastic toys.
Look at it this way: Huawei, the premier Chinese high-tech company, was only founded in 1987 and spent its first 10 years importing phone switches before making them itself. So it was not until 2000 that it started making its own switches.
So basically, of course, the business volume was low in the early years, since they were only catering to domestic demand. It was not until 2005 that Huawei started exporting phone switches.
As I said before, the high-tech industry DID NOT create large demand for semiconductors until the last 10 years.
So of course, if there is no large demand, why build large and expensive semiconductor fabs? It made more sense to import the chips.
Now that there is demand, the government has realized the vulnerability of the high-tech industry in China and started a program of building semiconductor fabs, since none of the private industry has the capital, the R&D effort, etc.
But it takes time, since even building the factory requires exacting construction.
Concurrently, the semiconductor tool industry there did not start in earnest until 2010.
But now that everybody has realized the strategic importance of semiconductors, they are building fabs like there is no tomorrow. China has programs to develop the whole range of processors, from GPUs to ASICs. Some already have prototypes and will commence production soon.
And any effort to stymie it will come to naught, since the technology has already diffused. Generations of Chinese scientists have learned the technology in the West. The problem is that there are not enough of them, and training more will take some time.
Right now there is a plethora of semiconductor tool fabricators in China. They might not have the latest technology, but it is good enough for most applications.