I just got back from a business trip. My development work is mainly in embedded environments, along with tinkering with NPU AI applications. The platforms I've worked with include:
RK3399 / RK3568 / RK3576 / RK3588 / AX630 / AX650 (1–18 TOPS INT8)
Going forward, my work will basically involve targeted applications of every domestic Chinese AI SoC, both the ones the public talks about and the ones it doesn't.
I haven't had hands-on exposure to many more AI SoCs yet, purely because I can't afford to buy that many development boards for experimentation (I'm currently working on mass-production customization of AI hardware, with an estimated investment of 120,000+ USD). The reason is that I add NPU/AI functionality to specific customized Linux builds, which is ten times more troublesome than doing it on the official Linux distributions (usually Ubuntu/Debian/Armbian).
I'm no amateur in AI hardware and software, though my projects primarily use the YOLO series of models, not LLM-type models. The truth is, current-generation NPUs like those in the RK33/RK35 series SoCs aren't well suited to LLMs; it's the next-gen RK36 series that's designed for them.
I respect the information shared by the person in the video, though I wouldn't call them an expert either. That said, their insights are particularly intriguing.
I've interacted with acknowledged heavyweights in China's internet circles, many of whom are now diving into AI. On the technical side, I'm confident that people like me have a far clearer grasp of the underlying technology than these industry leaders do.
Commercially, though, the survival of an early-stage venture fundamentally depends on the CEO's fundraising prowess, which is why people like the one in the video are indispensable to the commercialization process.