DeepSeek soon on Indian IT servers, says Ashwini Vaishnaw
The initiative will be powered by the India AI Compute Facility, which has secured 18,000 GPUs to drive the development of a Large Language Model (LLM) tailored for the country’s needs.
January 30, 2025 / 12:25 IST
Union IT Minister Ashwini Vaishnaw on January 30 said DeepSeek will soon be hosted on Indian servers.
"It is an open-source model. Like LLama, which is open source, this too can be hosted on Indian servers. Data privacy issues regarding DeepSeek can be addressed by hosting open-source models on Indian servers," Vaishnaw said at the Utkarsh Odisha Conclave.
He added that distillation is very important for AI models: "We will be keeping our models open and application-focused."
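Distillation, as referenced here, means training a smaller "student" model to mimic a larger "teacher" model's output distribution. A minimal sketch of the soft-label distillation loss (illustrative only, using NumPy; the logits and temperature are hypothetical, not tied to any specific model):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL divergence between the teacher's softened distribution (targets)
    # and the student's, scaled by T^2 as is conventional in distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))) * T * T)

# Hypothetical logits over three classes for demonstration.
teacher = np.array([2.0, 1.0, 0.1])
student = np.array([1.5, 1.2, 0.3])
loss = distillation_loss(teacher, student)
```

The student is trained to drive this loss toward zero, at which point its softened predictions match the teacher's.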
"Jio Platforms, CtrlS Datacenters Ltd, Locuz Enterprise Solutions, E2E Networks Limited, and NxtGEn DataCenter will supply the GPUs," Vaishnaw said.
Speaking on the cost of this facility, Vaishnaw called it the most affordable compute facility of its kind. "This is coming at significantly less than 1 dollar, with the cost being borne by the government. We will be able to provide a subsidy for four years. The real value of AI models will come, first, from algorithmic efficiency and, second, from the quality of datasets," he said.
On GPU tenders, he said the government is in talks with technology partners to bring in their expertise in managing a good data center, bring in GPUs, and make them available in a structured, open and accessible manner.
"Major chip designers are willing to work with India to develop indigenous GPUs. We will share more information soon on the indigenous development of GPUs. The design ecosystem for new chip creation and IP is developing nicely in India. We are working on a partnership with MeitY on co-developing GPUs. We will share more details on this in the coming week," he added.