
Ola founder Bhavish Aggarwal on Tuesday announced the launch of Krutrim-2, a homegrown large language model that he claims addresses the limitations existing AI models face with Indian languages.
The billionaire businessman also announced the launch of BharatBench, a "Comprehensive, Multimodal, Multilingual, Multi-task Indic benchmark" that tests how well AI models work with Indian languages.
"While we've been working on AI for a year, today we're releasing our work to the open source community and also publishing a bunch of technical reports.
"Our focus is on developing AI for India - to make AI better on Indian languages, data scarcity, cultural context etc," Aggarwal said on X, publishing several links leading to papers on the latest launches.
Krutrim-2 follows Krutrim-1, which was launched in January 2024 with 7 billion parameters.
According to a paper, Krutrim-2 is a 12-billion-parameter model built on the Mistral-NeMo architecture that supports English and 22 Indic languages.
"The model delivers best-in-class performance across Indic tasks and a promising performance on English benchmarks equivalent to models 5-10x the size," it read.
Aggarwal also announced that Nvidia's GB200 GPU would see its first Indian deployment by March, and claimed the company would have built India's largest supercomputer by the end of the year.
Also announced was fresh funding of Rs 2,000 crore, or about $230 million, for Krutrim, a SoftBank-backed startup.
Aggarwal claimed the company would have a total investment of Rs 10,000 crore, or $1.15 billion, by next year.