Artificial Intelligence thread

KYli

Brigadier
Nvidia believes the decline in sales to China can be offset by demand in other regions, as there is still a shortage of AI chips. In the short term Nvidia may not be affected much, but losing its second-biggest market would surely hurt, especially if the restrictions nurture a new competitor that can challenge Nvidia globally.

“We expect that our sales to these destinations will decline significantly in the fourth quarter of fiscal 2024, though we believe the decline will be more than offset by strong growth in other regions.”
 

BlackWindMnt

Captain
Registered Member
Nvidia believes the decline in sales to China can be offset by demand in other regions, as there is still a shortage of AI chips. In the short term Nvidia may not be affected much, but losing its second-biggest market would surely hurt, especially if the restrictions nurture a new competitor that can challenge Nvidia globally.

“We expect that our sales to these destinations will decline significantly in the fourth quarter of fiscal 2024, though we believe the decline will be more than offset by strong growth in other regions.”
I can see this working in the short term, maybe even the medium term, but what if Huawei also starts exporting its AI hardware and frameworks? Huawei can also lean on China's influence in the BRICS markets.

I could see India becoming a big market for Nvidia in the long term. But I'm not that familiar with how Indian law and companies handle data storage and retrieval for AI model training. I wouldn't actually be surprised if a lot of the Indian market's data were already in the hands of Western big tech, so this is pure speculation.
 

tacoburger

Junior Member
Registered Member

A Chinese company focused on using AI models in biotechnology:

From the sequence and structure of proteins to the behavior of multicellular systems, we are developing AI Foundation Models to understand and predict the behavior of life at multiple scales of complexity

Our family of Pre-trained Large Language Models is called xTrimo, short for Cross-Modal Transformer Representation of Interactome and Multi-Omics. xTrimo is trained on our curated and proprietary datasets, which include more than 6 billion proteins, 100 billion protein-protein interactions, and trillions of single-cell gene expression measurements from 100+ million cells. Model training is enabled by our world-leading super-computing center and enhanced by our AI-centric, 100,000 sq ft, high-throughput wet labs. xTrimo is the first life science AI Foundation Model to hit 100+ billion parameters and, to date, the largest of its kind.
Traditional AI methods require high quantities of labeled data to make accurate predictions. However, in frontier life science tasks, these labeled data are often in short supply. BioMap is revolutionizing the landscape by enabling one Foundation Model to inform multiple downstream task models with limited or even zero data, facilitating generative AI. We are pushing the boundaries of task models across a diverse range of applications, including therapeutic antibodies, industrial enzymes, and de novo designed proteins.

They just entered a partnership with a huge French pharmaceutical company.
 

Overbom

Brigadier
Registered Member
Huge if this is indeed the case: at least 40x faster inference. It would be revolutionary if we could achieve that.

I think it's only for CPU, not GPU, though.


Exponentially Faster Language Modelling

Language models only really need to use an exponential fraction of their neurons for individual inferences. As proof, we present UltraFastBERT, a BERT variant that uses 0.3% of its neurons during inference while performing on par with similar BERT models. UltraFastBERT selectively engages just 12 out of 4095 neurons for each layer inference. This is achieved by replacing feedforward networks with fast feedforward networks (FFFs).
While no truly efficient implementation currently exists to unlock the full acceleration potential of conditional neural execution, we provide high-level CPU code achieving 78x speedup over the optimized baseline feedforward implementation, and a PyTorch implementation delivering 40x speedup over the equivalent batched feedforward inference. We publish our training code, benchmarking setup, and model weights.
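The "12 out of 4095" figure falls out of organising a feedforward layer's neurons as a balanced binary tree and evaluating only the neurons on one root-to-leaf path: a tree of depth 12 has 2^12 − 1 = 4095 nodes, but any single input touches just 12 of them. Here is a minimal NumPy sketch of that conditional-execution idea; the dimensions, routing rule, and projections are illustrative stand-ins, not the paper's exact FFF formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, D_OUT, DEPTH = 16, 16, 12          # depth 12 -> 2**12 - 1 = 4095 nodes
N_NODES = 2**DEPTH - 1                   # total neurons in the tree

# one "neuron" per tree node: an input weight vector plus an output
# projection, used only if the node lies on the chosen path
W_in = rng.standard_normal((N_NODES, D_IN)) * 0.1
W_out = rng.standard_normal((N_NODES, D_OUT)) * 0.1

def fff_forward(x):
    """Evaluate only the DEPTH neurons on one root-to-leaf path."""
    y = np.zeros(D_OUT)
    node, used = 0, 0
    for _ in range(DEPTH):
        a = W_in[node] @ x               # this neuron's pre-activation
        y += max(a, 0.0) * W_out[node]   # ReLU contribution to the output
        used += 1
        # the sign of the activation routes the input left or right
        node = 2 * node + (1 if a > 0 else 2)
    return y, used

y, used = fff_forward(rng.standard_normal(D_IN))
print(used, N_NODES)                     # 12 neurons touched out of 4095
```

The speedup claim is about exactly this ratio: per-token work scales with the tree depth (logarithmic) rather than the layer width (linear), which is why the paper's gains show up even in a plain CPU implementation.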
 

BlackWindMnt

Captain
Registered Member

OpenAI ‘was working on advanced model so powerful it alarmed staff’

Reports say new model Q* fuelled safety fears, with workers airing their concerns to the board before CEO Sam Altman’s sacking

Does anyone know if there are any papers available on Q*?
This sounds like pure bullshit. You had the same thing with the deep learning paradigm: deep learning institutions were going to create Skynet any moment. Six years later no one is talking about deep learning because crypto took over the VC money.

This OpenAI circus feels a lot like an actively running psyop, like the UFO circus this spring and summer.
 

Overbom

Brigadier
Registered Member

OpenAI ‘was working on advanced model so powerful it alarmed staff’

Reports say new model Q* fuelled safety fears, with workers airing their concerns to the board before CEO Sam Altman’s sacking

Does anyone know if there are any papers available on Q*?
There are papers on Q*, but the problem is that we don't know whether OpenAI's Q* algorithm is that Q* algorithm and not something else entirely.

Until we have confirmation that they are the same thing, it is meaningless to talk about the papers and the tech.
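For context on the name: in the reinforcement-learning literature, Q* conventionally denotes the optimal action-value function, the fixed point of the Bellman optimality backup that tabular Q-learning converges to. Whether OpenAI's Q* has anything to do with this is, as the post says, unconfirmed. A minimal Q-learning sketch on a toy 5-state chain, with illustrative parameters:

```python
import numpy as np

# toy 5-state chain: move right to reach the goal state (reward 1)
N_STATES, N_ACTIONS = 5, 2               # action 0 = left, 1 = right
GAMMA, ALPHA, EPS, EPISODES = 0.9, 0.5, 0.2, 500
rng = np.random.default_rng(0)

def step(s, a):
    """Environment transition: returns (next state, reward, done)."""
    s2 = min(s + 1, N_STATES - 1) if a == 1 else max(s - 1, 0)
    r = 1.0 if s2 == N_STATES - 1 else 0.0
    return s2, r, s2 == N_STATES - 1

Q = np.zeros((N_STATES, N_ACTIONS))
for _ in range(EPISODES):
    s, done = 0, False
    while not done:
        # epsilon-greedy exploration
        a = rng.integers(N_ACTIONS) if rng.random() < EPS else int(Q[s].argmax())
        s2, r, done = step(s, a)
        # Bellman optimality backup toward Q*:
        # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        target = r + (0.0 if done else GAMMA * Q[s2].max())
        Q[s, a] += ALPHA * (target - Q[s, a])
        s = s2

print(Q.argmax(axis=1))   # greedy policy: "right" in every non-terminal state
```

The learned table approximates Q*(s, right) = γ^(distance to goal − 1), so the greedy policy always moves right. The speculation around OpenAI's Q* is that it involves something far beyond this, but nothing published confirms any connection.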
 

BoraTas

Captain
Registered Member
This sounds like pure bullshit. You had the same thing with the deep learning paradigm: deep learning institutions were going to create Skynet any moment. Six years later no one is talking about deep learning because crypto took over the VC money.

This OpenAI circus feels a lot like an actively running psyop, like the UFO circus this spring and summer.
Don't forget the nanotech trend before that, and the "genetics revolution" and the 3D-printing hype before that.

"AI safety" is just some bureaucrats trying to gain more regulatory powers and the usual sensationalism by the media. I hope they won't regulate the field into stagnation, like they did in several other fields.
 

SDtom

New Member
Registered Member
There are papers on Q*, but the problem is that we don't know whether OpenAI's Q* algorithm is that Q* algorithm and not something else entirely.

Until we have confirmation that they are the same thing, it is meaningless to talk about the papers and the tech.

I hope the Chinese government and corporations don't wait for confirmation before pushing harder toward general AI. This will have as much impact as the first industrial revolution, if not greater. The gap between countries that have it and those that don't will be like one using a modern-day calculator versus one using an abacus. Yes, there are many groups in China making new models with ever higher parameter counts, but there seems to be a lack of leaders and visionaries who can harness all these tools to produce globally impactful, leading products.
 