Artificial Intelligence thread

nugroho

Junior Member
It's not difficult to understand why the Chinese govt is not thrilled about the prospect of Chinese AI companies importing more US GPUs from Nvidia. Why help cement US hegemony in chip sales?

View attachment 167844


The US compute advantage is real and massive. Yet, while compute is important, it is also not everything. Meta had a huge budget but didn't deliver much. DeepSeek and others show that high talent density punches way above its weight. I would not bet against Chinese AI companies, but we have to acknowledge that they will have compute disadvantages compared to their US peers for years to come. It's just silly to pretend otherwise.

And here's broken down by country. Note: this includes Chinese companies buying NV GPUs.

View attachment 167846
" EPOCH " ?? something fishy
 

tphuang

General
Staff member
Super Moderator
VIP Professional
Registered Member
Qwen's open-source multimodal model is quite impressive, but its Qwen3 MAX is already significantly behind, and thinking still hasn't been fixed. I haven't used it for a long time.
Different AI labs serve different purposes. Qwen is the cornerstone of all the edge AI stuff.
 

iewgnem

Captain
Registered Member
China's total computing capacity is around 788 EFLOPS, while the US has around 10 million H100s, which is roughly equivalent to 10,000 EFLOPS, so 10x is reasonable.
But the total revenue from AI models is around 1/10 of the cost of buying these chips, so 10x also means a huge deficit.
Do you realize, considering Chinese AI companies are clearly fully matching US offerings, that what you're actually implying is that American AI is so bad it needs to spend 10x more on compute to match China?
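A rough back-of-the-envelope check of those figures (a sketch only; the per-GPU throughput and fleet size are assumptions taken from the numbers quoted above, not sourced data):

[CODE=python]
# Rough sanity check of the compute figures quoted above.
# Assumptions (illustrative): ~1 PFLOP/s per H100-class GPU,
# ~788 EFLOP/s total installed capacity in China, ~10 million H100-class GPUs in the US.

H100_PFLOPS = 1.0                          # assumed per-GPU throughput, PFLOP/s
us_gpus = 10_000_000                       # assumed US H100-class fleet size
us_eflops = us_gpus * H100_PFLOPS / 1000   # 1 EFLOP/s = 1000 PFLOP/s
cn_eflops = 788                            # figure quoted in the post

print(f"US total: {us_eflops:,.0f} EFLOP/s")
print(f"China:    {cn_eflops:,.0f} EFLOP/s")
print(f"Ratio:    {us_eflops / cn_eflops:.1f}x")   # ~12.7x, i.e. 'roughly 10x'
[/CODE]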
 

lockedemosthenes1

New Member
Registered Member
Do you realize, considering Chinese AI companies are clearly fully matching US offerings, that what you're actually implying is that American AI is so bad it needs to spend 10x more on compute to match China?
Not exactly, because training an AI model perhaps doesn't need 10x the computing power, and a large part of those H100/H200 chips are used for model inference.
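A minimal sketch of that point, assuming purely illustrative inference/training splits (every number below other than the headline totals is made up for illustration, not sourced):

[CODE=python]
# Illustrative only: if a large share of the US fleet is busy serving inference,
# the gap in *training* compute is narrower than the headline ~10x total-compute figure.

us_total_eflops = 10_000          # headline US figure from the posts above
cn_total_eflops = 788             # headline China figure from the posts above

us_inference_share = 0.7          # assumed: 70% of US capacity on inference
cn_inference_share = 0.5          # assumed: 50% of Chinese capacity on inference

us_training = us_total_eflops * (1 - us_inference_share)
cn_training = cn_total_eflops * (1 - cn_inference_share)

# ~7.6x under these made-up splits, versus ~12.7x for total compute
print(f"Training-compute ratio: {us_training / cn_training:.1f}x")
[/CODE]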
 