I find this a very good article. Guys, allow me to post this and I will keep quiet for a whole week.
Agree and disagree. The disagreement is far stronger than the agreement for me. Let me explain why.
The article was quite good in the way it tried to explain things: what AI equipment or chips were needed, what the motivations were, and what kinds of trends are developing in the industry.
That is quite good, because the author tries to explain.
For people who do not follow tech on a regular basis, this article should be considered good.
I don't like this article because it is a typical American echo chamber.
echo chamber ... echo chamber ...
So I would think a lot of people in this thread would say the article makes a few good points, but is still a bad article.
For example, consider these two paragraphs:
HPC systems and AI have been on the path toward convergence for some time, as the introduction of AI — particularly in the form of machine learning — promises to open new frontiers for supercomputing. Exascale machines will contain on the order of 135,000 GPUs and 50,000 CPUs, and each of those chips will have many individual processing units, requiring engineers to write programs that execute almost a billion instructions simultaneously. Already, of the HPCs on the Top 500 list of most powerful computers, more than 100 use Nvidia GPUs, including in China, primarily A100s and V100s, for acceleration and optimization of math intensive workloads.
These same GPUs, coupled with high end CPUs, and other semiconductors such as field programmable gate arrays (FPGAs), are all being used by U.S. and Chinese firms to run AI training algorithms in the cloud, along with application-specific integrated circuits (ASIC) designed to run specific types of AI algorithms, like the Tesla ASICs used for video processing to assist full self-driving and autopilot. Other ASICs from startups like the U.K.’s Graphcore and U.S. firm Cerebras are being designed and manufactured by advanced foundries and incorporated into large systems and used to optimize running of AI workloads.
At least they kind of know what is going on.
But ... like ... big hairy deal. Everyone kind of knows this.
The article says there is a convergence of AI and HPC, meaning artificial intelligence is making use of high performance computing.
Like duh! What else would AI need but faster and better machines?
(For faster and better machines, we need faster chips! And better, more innovative ways of linking them to work together, which China has demonstrated considerable skill in doing.)
In the end, although it has merit, I did not like this article, because what it covers is what people discuss every day here in this thread, and in more detail.
The article did not offer anything new either.