Artificial Intelligence thread

luminary

Senior Member
Registered Member
Some o3 commentary
Basically it seems the test creator is an OpenAI shill and intentionally omitted a few things, which puts the credibility of the test in jeopardy.




Looks like DeepSeek is dropping something today
From their paper, training took 2.788M H800 GPU hours. For a 2-month training run, that's roughly only 2,000 H800s.
This is insanity. It was trained with roughly 10% of the compute it took to train Llama 405B.
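The arithmetic in the figures above can be sanity-checked with a quick back-of-envelope sketch. Assumptions not from the thread: a "2 month" run is taken as ~61 days, and Llama 3.1 405B's training compute is taken as ~30.84M H100 GPU hours (the figure published in Meta's model card).

```python
# Back-of-envelope check of the DeepSeek-V3 compute figures quoted above.

total_gpu_hours = 2_788_000   # H800 GPU hours, from the DeepSeek-V3 paper
run_hours = 61 * 24           # assumed ~2-month (61-day) training run

# How many GPUs running continuously would produce that many GPU hours?
gpus_needed = total_gpu_hours / run_hours
print(f"GPUs implied: {gpus_needed:.0f}")   # ~1,900, consistent with "roughly 2,000 H800"

# Compare against Llama 3.1 405B (assumed ~30.84M H100 GPU hours, per Meta's model card)
llama_405b_gpu_hours = 30_840_000
ratio = total_gpu_hours / llama_405b_gpu_hours
print(f"Fraction of Llama 405B compute: {ratio:.1%}")   # ~9%, consistent with "roughly 10%"
```

Note this compares raw GPU hours only; H800 and H100 differ in interconnect bandwidth, so the real efficiency gap depends on more than the hour count.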

Hearing quite a few US analysts + engineers confused and coping this morning.
 

9dashline

Captain
Registered Member
I had a Sora feeling about o3... it was supposed to be part of "shipmas," but not only was o3 not shipped, it was announced with only benchmarks and not even a live demonstration. The only reason Altman would do that is because he had something to hide and was in fake-it-till-you-make-it mode; otherwise he would be bragging with real live examples instead of just a rigged chart / fake benchmark. And they even told the ARC-AGI guy not to disclose how much compute was used or the hardware/cost, lol... probably millions of dollars for a fake dog and pony show.

And after I purchased o1 pro mode for $200, I really expected, wanted, and hoped that the pro version of Sora would be way better than Kling, but in all my tests it's worse than Kling...
 

9dashline

Captain
Registered Member
With that definition of AGI and the projection that OAI won't even make a profit until 2029, and now price pressure from DeepSeek, it doesn't seem like OAI will ever achieve "AGI" lol

 

siegecrossbow

General
Staff member
Super Moderator

Are you using the free or paid version of Kling?
 

tphuang

Lieutenant General
Staff member
Super Moderator
VIP Professional
Registered Member
Yes, I saw that, but it's still quite unbelievable that they pulled this off.

My point is that it's in DeepSeek's interest, and China's interest, not to fully acknowledge just how many GPUs they have or are using for training. So I do put a question mark on claims like this.
 