Artificial Intelligence thread

supercat

Major
Steve Hsu and David P Goldman write about DeepSeek:
There are seven major categories of AI applications in which the US and China compete. China is ahead in most of them, and its AI prowess is likely to increase its lead. They are:
  1. Manufacturing: China has poured enormous resources into factory automation. One gauge is the number of factories outfitted with dedicated 5G networks, which support AI applications. China claims 10,000 such installations, while the US has only a few dozen, concentrated in the auto industry. The advantage is strongly in China’s favor, and advances in AI are likely to enhance it. But US manufacturing has had little impact on equity valuations.
  2. Internet of Things: China is ahead in automating transportation and warehousing, with fully robotic warehouses now in operation.
  3. Robotics: China installs more industrial robots each year than the rest of the world combined and is now a major producer of industrial robots.
  4. Low-altitude economy: China is the leader in this sector, which government planners cited for the first time in a December 2024 working paper. Drone taxis, drone deliveries and other applications are now a $100 billion business in China and are expected to double by 2026.
  5. Autonomous vehicles: We’ll call this a toss-up between the US and China, although China already has autonomous taxi services operating on a small scale.
  6. Large Language Models: again, a toss-up. The gains to be harvested by LLMs include the Philippines’ $40 billion a year call center business, in which human operators can be replaced by AI systems to a considerable extent. But the possibilities for LLM applications are so varied and extensive that predictions are guesswork at this stage.
  7. Biotech: The US has a distinct advantage with a strong pharmaceutical development infrastructure. China has a lead in medical data, but America’s complex of large pharmaceutical companies, startups and venture capitalists gives it an edge.
The big question mark over LLM monetization is timing. The payoff could be big but will probably take longer than expected to materialize.

OptimusLion

Junior Member
Registered Member
Moore Threads deploys DeepSeek distilled-model inference service on domestic GPUs

DeepSeek's open-source models (the V3 and R1 series) have demonstrated excellent performance in multilingual understanding and complex reasoning tasks. Moore Threads Intelligent Technology (Beijing) Co., Ltd. announced today that it has deployed an inference service for DeepSeek's distilled models on its domestic GPUs.
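For anyone curious what serving one of these distilled checkpoints actually involves, here is a minimal sketch using the standard Hugging Face transformers path on a generic GPU. It is only an illustration of the recipe, not Moore Threads' deployment, which runs on their own MUSA software stack; the checkpoint name assumed here is one of the distills DeepSeek published.

```python
# Minimal sketch: run a DeepSeek distilled model with Hugging Face transformers.
# Illustration only -- Moore Threads' actual service uses its own MUSA stack.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # assumed published checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # ~2 bytes per param; a 7B distill fits in ~14 GB
    device_map="auto",           # place weights on whatever accelerator is available
)

messages = [{"role": "user", "content": "Explain why the sky is blue in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```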




OppositeDay

Senior Member
Registered Member
Moore Threads deploys DeepSeek distilled-model inference service on domestic GPUs

DeepSeek's open-source models (the V3 and R1 series) have demonstrated excellent performance in multilingual understanding and complex reasoning tasks. Moore Threads Intelligent Technology (Beijing) Co., Ltd. announced today that it has deployed an inference service for DeepSeek's distilled models on its domestic GPUs.




Distilled models only. Moore Threads' best card has 48 GB of VRAM, enough for the bigger distilled models but nowhere near enough for the real thing.
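Rough back-of-envelope numbers behind that, counting model weights only (KV cache and activations push the real requirement higher). The parameter counts are the published ones for R1 and its distills; the quantization levels are just illustrative assumptions.

```python
# Weight-memory sanity check behind "48 GB is not enough for the real thing".
# Weights only; KV cache and activation memory are ignored, so real needs are higher.
def weight_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * bytes_per_param  # (params * 1e9 * bytes) / 1e9

MOORE_THREADS_VRAM_GB = 48

models = {
    "DeepSeek-R1 (full, 671B total params, FP8)": weight_gb(671, 1.0),
    "R1-Distill-Qwen-32B (BF16)":                 weight_gb(32, 2.0),
    "R1-Distill-Qwen-32B (INT4 quantized)":       weight_gb(32, 0.5),
    "R1-Distill-Llama-70B (INT4 quantized)":      weight_gb(70, 0.5),
}

for name, gb in models.items():
    fits = "fits" if gb <= MOORE_THREADS_VRAM_GB else "does NOT fit"
    print(f"{name}: ~{gb:.0f} GB of weights -> {fits} on a 48 GB card")
```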
 

tphuang

Lieutenant General
Staff member
Super Moderator
VIP Professional
Registered Member
Reminder that gpt-4o is still really expensive at $2.50 per 1 million input tokens and $10 per 1 million output tokens.
o1 is $15/1M input tokens and $60/1M output tokens.

Even if you use a Western host for DeepSeek like together.ai:


The fee is $1.25 per 1 million input/output tokens for V3 and $7 for the full-version R1.

And together.ai is a lot faster than OpenAI.
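To put those per-token prices in perspective, here is a quick cost sketch. The prices are the ones quoted above; the monthly workload is made up purely for illustration.

```python
# Cost comparison using the per-million-token prices quoted above.
PRICES = {  # USD per 1M tokens: (input, output)
    "gpt-4o":                    (2.50, 10.00),
    "o1":                        (15.00, 60.00),
    "DeepSeek V3 (together.ai)": (1.25, 1.25),
    "DeepSeek R1 (together.ai)": (7.00, 7.00),
}

# Hypothetical workload: 50M input tokens and 10M output tokens per month.
INPUT_M, OUTPUT_M = 50, 10

for model, (p_in, p_out) in PRICES.items():
    cost = INPUT_M * p_in + OUTPUT_M * p_out
    print(f"{model}: ${cost:,.2f}/month")
```

On those made-up numbers, o1 comes out at $1,350/month versus $420 for R1 and $75 for V3, which is the point being made about how expensive OpenAI's pricing still is.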
 

Overbom

Brigadier
Registered Member
Reminder that gpt-4o is still really expensive at $2.50 per 1 million input tokens and $10 per 1 million output tokens.
o1 is $15/1M input tokens and $60/1M output tokens.

Even if you use a Western host for DeepSeek like together.ai:

The fee is $1.25 per 1 million input/output tokens for V3 and $7 for the full-version R1.

And together.ai is a lot faster than OpenAI.
OpenRouter prices right now for DeepSeek R1 (per million tokens):
[screenshot: OpenRouter per-provider pricing for DeepSeek R1]
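If you want to hit that route yourself, OpenRouter exposes an OpenAI-compatible endpoint, so something like the sketch below should work. The base URL and the "deepseek/deepseek-r1" model slug are assumptions based on OpenRouter's published conventions, so check them against the site before relying on them.

```python
# Minimal sketch: query DeepSeek R1 through OpenRouter's OpenAI-compatible API.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",      # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],     # your OpenRouter key
)

response = client.chat.completions.create(
    model="deepseek/deepseek-r1",                 # assumed OpenRouter model slug
    messages=[{"role": "user", "content": "Summarize the CAP theorem in two sentences."}],
)
print(response.choices[0].message.content)
```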
 