Artificial Intelligence thread

tphuang

Lieutenant General
Staff member
Super Moderator
VIP Professional
Registered Member
Fudan University will cooperate with Alibaba to create AI for Science. They launched "Computing for the Future at Fudan" yesterday. Fudan developed China's first conversational large language model, MOSS, last year.

iFlyTek announced a 1+4 strategy for AI in ASEAN, with Singapore as the 1 and Indonesia, Vietnam, Malaysia, and Thailand as the 4. iFlyTek has 120k developers internationally, with 40k in Southeast Asia.

China Mobile is starting a metaverse industry alliance with Huawei, Xiaomi, and iFlyTek as initial members.

China Mobile aims to have 20 EFLOPS of computing power by 2025.
 

BlackWindMnt

Captain
Registered Member
Does anyone know if there are programs in China to mesh AI into their military sensors like radar, sonar, FLIR, etc.? I'm sure the Americans are doing that already.
I would be surprised if they weren't; people have been doing this for over a decade now with self-driving vehicles.

If it works for self-driving vehicles, putting target/object recognition on a missile for the last mile of the kill chain only makes sense to me.
 

gelgoog

Brigadier
Registered Member
It is pretty well known that China is using AI to analyze the images taken by its ER satellites. This has been shown to be capable of identifying airplanes on the ground and ships at sea. Quite likely they are also applying such technology to radar and sonar classification.

The Russian Su-35 has AI that analyzes the radar signatures of opposing aircraft, identifies and classifies them, and suggests to the pilot how to act. I would be surprised if China did not have such technology either in service or in development. Russia has been working on artificial neural networks since at least the late 1990s. If anything, the Su-57 has even more AI compute power than the Su-35. And China has far more expertise than Russia in chip manufacturing.
 

Eventine

Junior Member
Registered Member
As English is the global language - unfortunately - I think Chinese LLM builders will still need to prove their capabilities in English, rather than Chinese, to win a larger piece of the global pie that US companies currently dominate. But perhaps there is an alternative strategy if Chinese companies can get strong at localization: there's a huge market for local-language LLMs that US companies aren't particularly good at serving.
 

luminary

Senior Member
Registered Member
From Anil Dash:
Today's highly hyped generative AI systems (most famously OpenAI's) are designed to generate bullshit. To be clear, bullshit can sometimes be useful, and even accidentally correct, but that doesn't keep it from being bullshit. Worse, these systems are not meant to generate consistent bullshit: you can get different bullshit answers from the same prompt. You can put garbage in and get bullshit out, but the same quality of bullshit you get from non-garbage inputs! And enthusiasts are currently mistaking the fact that the bullshit is consistently wrapped in the same envelope for the bullshit inside being consistent, laundering unreasonableness into the appearance of reason.

Now we have billions of dollars being invested in technologies about which it is impossible to make falsifiable assertions. A system that you cannot debug through a logical, Socratic process is a vulnerability that exploitative tech tycoons will use to do what they always do: undermine the vulnerable.
Now that I've said that...

Meituan buys out AI start-up​

  • Beijing-based Light Year is among a slew of new Chinese AI start-ups to emerge amid the country’s ChatGPT-inspired frenzy
Meituan is buying out Light Year, an artificial intelligence (AI) start-up established by a co-founder and former director of the food delivery services giant, for a total consideration of $281 million, as Chinese Big Tech firms step up their interest in AI, according to media reports. The deal comprises $233.7 million in cash and RMB 336.9 million ($47 million) in assumed liabilities. Upon completion of the acquisition, Meituan will hold 100% of the Beijing-based start-up, it said in a filing to the Hong Kong stock exchange on Thursday.
 

tokenanalyst

Brigadier
Registered Member

China Starts Using First Homegrown Genetic Model to Predict Heart Disease



A Shanghai medical institution has begun using the country's first self-developed genetic testing model to assess people at risk of conditions that affect blood flow in the heart and the brain.
Under the leadership of Gu Dongfeng, an academician of the Chinese Academy of Sciences, Ping An Healthcare Diagnostics Center, a medical screening affiliate of Ping An Insurance Co. of China, and Boke Bioscience, a DNA sequencing startup, launched MetaPRS, an evaluation model that predicts cardiovascular and cerebrovascular diseases such as coronary heart disease, stroke, and aneurysm, Yicai Global learned at the recent launch ceremony.
The research team built the model, which covers more than 600 genetic variants relevant to cardiovascular and cerebrovascular diseases, using data collected over the past two decades from more than 120,000 people across 15 provincial-level regions of China.
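A polygenic risk score of this kind is, at its core, a weighted sum of risk-allele counts across the model's variants. Here is a minimal sketch in Python; the variant IDs and effect weights are invented for illustration, since the article does not publish MetaPRS's actual variants or weights:

```python
# Minimal sketch of a polygenic risk score (PRS) computation. A real
# model like MetaPRS uses 600+ curated variants with weights fitted on
# large cohort data; the values below are made up.

def polygenic_risk_score(genotypes, weights):
    """Weighted sum of risk-allele dosages (0, 1, or 2 per variant).

    genotypes: dict mapping variant ID -> allele dosage for one person
    weights:   dict mapping variant ID -> per-allele effect weight
    Variants missing from the genotype data contribute nothing.
    """
    return sum(weights[v] * genotypes.get(v, 0) for v in weights)

# Hypothetical per-allele effect weights (e.g. log odds ratios).
weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

# One person's allele dosages at those variants.
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

score = polygenic_risk_score(person, weights)
print(round(score, 4))  # 0.12*2 + (-0.05)*1 + 0.30*0 = 0.19
```

In practice the raw score is then compared against a reference population distribution to place an individual in a risk percentile.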


 

tokenanalyst

Brigadier
Registered Member


A new symbolic memory framework, ChatDB, proposed by Zhao Hang's research group at the Institute for Interdisciplinary Information Sciences


Tsinghua News, June 29. Researchers from the group of Assistant Professor Zhao Hang at the Institute for Interdisciplinary Information Sciences, Tsinghua University, together with collaborating institutions, have proposed a new symbolic memory framework, ChatDB. It overcomes limitations of previously common memory frameworks, such as imprecise manipulation of stored information and the lack of structure in how historical information is stored.

Figure 1. ChatDB workflow diagram
ChatDB pairs a large language model (such as ChatGPT) with a database, using symbolic operations (i.e., SQL statements) to record, process, and analyze historical information accurately over the long term and to help respond to user requests. The framework has three main stages: input processing, chain of memory, and response summary. In the first stage, the LLM processes the user's request; for requests that do not involve the database memory module it generates a reply directly, while for requests that do, it generates a series of SQL statements for interacting with the memory module. In the second stage, the chain of memory carries out a sequence of intermediate memory operations against the symbolic memory module: ChatDB executes the previously generated SQL statements (insert, update, select, delete) in order, and the external database runs each statement, updates its state, and returns the result. Before each memory operation, ChatDB decides whether to revise the current operation based on the results of earlier SQL statements. In the third stage, the language model synthesizes the results of the database interactions into a summary reply to the user.
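The three stages can be sketched in a few lines of Python. The LLM is replaced here by a hard-coded list of SQL steps, since the point is only to show how a real database serves as exact, structured symbolic memory; the table and column names are invented for illustration:

```python
# Sketch of ChatDB's chain-of-memory idea with an in-memory SQLite
# database standing in for the external symbolic memory module.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (item TEXT, qty INTEGER, price REAL)")

# Stage 1 (input processing): an LLM would translate the user request
# "record two sales, then total revenue for apples" into SQL steps.
memory_chain = [
    ("INSERT INTO sales VALUES (?, ?, ?)", ("apple", 3, 2.0)),
    ("INSERT INTO sales VALUES (?, ?, ?)", ("pear", 1, 3.5)),
    ("SELECT SUM(qty * price) FROM sales WHERE item = ?", ("apple",)),
]

# Stage 2 (chain of memory): execute each step in order; in ChatDB,
# intermediate results can feed back into later steps.
result = None
for sql, params in memory_chain:
    cur = db.execute(sql, params)
    if sql.lstrip().upper().startswith("SELECT"):
        result = cur.fetchone()[0]
db.commit()

# Stage 3 (response summary): an LLM would phrase this for the user.
print(f"Total apple revenue: {result}")  # Total apple revenue: 6.0
```

Because the memory lives in a database rather than in the model's context window, recall is exact and unbounded by context length.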

Figure 2. ChatDB framework overview

To verify that using a database as ChatDB's symbolic memory module improves the effectiveness of large language models, and to compare quantitatively against other models, the researchers constructed a synthetic dataset of fruit-shop operations and management, the "Fruit Shop Dataset". It contains 70 shop records generated in chronological order, about 3,300 tokens in total (below ChatGPT's maximum context window of 4,096 tokens). The records cover four common fruit-shop operations: purchase, sale, price adjustment, and return. The LLM module in ChatDB uses ChatGPT (GPT-3.5 Turbo) with the temperature parameter set to 0, and MySQL serves as the external symbolic memory module. The baseline for comparison is ChatGPT (GPT-3.5 Turbo) with a maximum context length of 4,096 and temperature likewise set to 0. On the fruit-shop question-answering dataset, ChatDB showed a significant advantage over ChatGPT in answering these questions.
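To make the four record types concrete, here is a sketch of how purchase, sale, price adjustment, and return map onto SQL updates to a running stock table. The schema is invented here (the paper uses MySQL; SQLite is used below for a self-contained example):

```python
# Hypothetical schema for fruit-shop records as symbolic memory.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stock (fruit TEXT PRIMARY KEY, qty INTEGER, price REAL)")

db.execute("INSERT INTO stock VALUES ('banana', 50, 1.2)")                # purchase
db.execute("UPDATE stock SET qty = qty - 10 WHERE fruit = 'banana'")      # sale
db.execute("UPDATE stock SET price = 1.5 WHERE fruit = 'banana'")         # price adjustment
db.execute("UPDATE stock SET qty = qty + 2 WHERE fruit = 'banana'")       # return

qty, price = db.execute(
    "SELECT qty, price FROM stock WHERE fruit = 'banana'"
).fetchone()
print(qty, price)  # 42 1.5
```

Answering a question like "how many bananas are in stock?" then reduces to a single SELECT, rather than asking the LLM to track 70 records of arithmetic in its context window.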

The work was recently published on arXiv in the paper "ChatDB: Augmenting LLMs with Databases as Their Symbolic Memory".

The co-first authors of the paper are Hu Chenxu, a doctoral student at the Institute for Interdisciplinary Information Sciences, Tsinghua University, and Fu Jie, a researcher at the Zhiyuan Research Institute (Beijing Academy of Artificial Intelligence). The corresponding authors are Fu Jie and Zhao Hang, an assistant professor at the Institute for Interdisciplinary Information Sciences. Other authors include Luo Simian and Assistant Professor Zhao Junbo of Zhejiang University.

 