Miscellaneous News

siegecrossbow

General
Staff member
Super Moderator

So this is essentially saying that DeepSeek has saved Apple's arse.

This adds to the complications of a DeepSeek ban. Companies at the forefront like OpenAI are obviously against it, but companies that have fallen behind, like Apple, will need it to level the playing field. We will see an epic wizard duel between tech lobbyists very soon.
 

daifo

Major
Registered Member
Nvidia down almost 4% today. The stock is now down almost 20% since DeepSeek came out.

Seems hobbyists are hosting DeepSeek on non-Nvidia machines. They are building them with Macs and Intel boxes with large amounts of RAM. However, they all suffer from limited token rates. Maybe some people expect improvements on the inference side from added designs in future CPUs, and thereby less need for Nvidia-specific chips.
 

4Runner

Junior Member
Registered Member
Seems hobbyists are hosting DeepSeek on non-Nvidia machines. They are building them with Macs and Intel boxes with large amounts of RAM. However, they all suffer from limited token rates. Maybe some people expect improvements on the inference side from added designs in future CPUs, and thereby less need for Nvidia-specific chips.
Right, hobbyists don't need Nvidia graphics cards with CUDA to run DeepSeek R1 locally. I was running the DeepSeek R1 distilled Llama 8B model with Ollama on my Ubuntu 24.04 laptop yesterday. It was very slow, but it works. And the setup could scale up or scale out if I were willing to spend more time and money. And that, I believe, is what people are saying: the proverbial genie cannot be put back in the bottle. Technically anyone can set up their own DeepSeek AI service in their garage if they are willing to spend $100K and a month. A "Cultural Revolution of AI", that is ...

For anyone who wants to reproduce that kind of local setup, here is a minimal sketch of querying a locally running Ollama server over its REST API. It assumes Ollama is installed and a distilled model has already been pulled; the tag "deepseek-r1:8b" is my assumption for how the Llama-distilled 8B variant is named, so check the exact tag on your install.
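```python
# Minimal sketch: ask a locally running Ollama server for one completion.
# Assumes Ollama is running on its default port and the distilled model
# tag "deepseek-r1:8b" has been pulled (tag name may differ on your machine).
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "deepseek-r1:8b"  # assumed tag for the Llama-distilled 8B variant

payload = {
    "model": MODEL,
    "prompt": "Explain mixture-of-experts inference in two sentences.",
    "stream": False,  # return a single JSON object instead of a token stream
}

resp = requests.post(OLLAMA_URL, data=json.dumps(payload), timeout=600)
resp.raise_for_status()
print(resp.json()["response"])  # generated text; expect low tokens/s on a CPU-only laptop
```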
 

BlackWindMnt

Captain
Registered Member
Seems hobbyists are hosting DeepSeek on non-Nvidia machines. They are building them with Macs and Intel boxes with large amounts of RAM. However, they all suffer from limited token rates. Maybe some people expect improvements on the inference side from added designs in future CPUs, and thereby less need for Nvidia-specific chips.
I'm gonna bet Huawei, Microsoft, AWS, and Apple are all pointing their semi design houses at that problem.

I think the big problem right now is memory size on devices for the bigger and more useful models. I think the tech that Intel and Micron made, "3D XPoint" non-volatile memory, might be a nice solution: 3D XPoint can act like storage but with memory-like bandwidth. For normal programs there was never really a compelling use case outside of very high-performance workloads.

To put rough numbers on the memory problem, here is a back-of-the-envelope sketch. The parameter counts are approximate (full DeepSeek R1 is around 671B total parameters; the distilled 8B and 70B variants are far smaller), and it only counts the weights themselves, ignoring KV cache and runtime overhead.
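```python
# Rough estimate of the RAM/VRAM needed just to hold model weights at
# different quantization levels. Parameter counts are approximate and for
# illustration only; KV cache and runtime overhead are ignored.
models = {
    "R1 distilled 8B": 8e9,
    "R1 distilled 70B": 70e9,
    "Full R1 (~671B MoE)": 671e9,
}

bytes_per_param = {
    "FP16": 2.0,
    "8-bit": 1.0,
    "4-bit": 0.5,
}

for name, n_params in models.items():
    sizes = ", ".join(
        f"{quant}: {n_params * b / 2**30:,.0f} GiB"
        for quant, b in bytes_per_param.items()
    )
    print(f"{name:>22} -> {sizes}")
```
Even at 4-bit, the full model needs hundreds of gigabytes just for weights, which is why people reach for Macs with large unified memory or servers stuffed with RAM.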

 

proelite

Junior Member
This adds to the complications of a DeepSeek ban. Companies at the forefront like OpenAI are obviously against it, but companies that have fallen behind, like Apple, will need it to level the playing field. We will see an epic wizard duel between tech lobbyists very soon.

OpenAI + Anthropic + Softbank
vs
Google + Meta + Microsoft + Apple

Who do you have your money on?
 