Indic Model: India’s Big AI Startup Sarvam Launches but Struggles to Get Users

Sarvam AI launches its Indic model Sarvam-M with $1B funding, but sees only 23 downloads in 2 days. Is India missing the real AI opportunity?

Introduction

In a country where over a billion people speak dozens of languages, building AI that understands them all should be a game-changer. That’s exactly what India’s most heavily funded AI startup, Sarvam AI, set out to do with the launch of Sarvam-M, a massive Indic language model.

But just days after its release, the excitement seems to have fizzled. Despite backing from big investors and a bold vision to serve India’s linguistic diversity, the model has seen only 23 downloads in two days — a surprisingly low number for such a high-profile launch.

The question now is: does India really need another large language model, or are we missing the real AI opportunity?


5 Key Points About Sarvam’s Indic Model Launch

Sarvam-M is a 24-billion parameter LLM, based on the Mistral Small architecture and trained on Indian language data.

Despite $1 billion in funding, the model saw only 23 downloads in two days after launch.

Korean open-source model Dia recorded more than 200,000 downloads in one month.

Sarvam-M performs slightly better than similarly sized models on Indic-language benchmarks, but demand for it so far has been weak.

Experts say much of India’s AI effort focuses on “cool AI things” instead of solving important, real problems.


Background: What Is Sarvam-M and Why It Matters

Sarvam AI is one of India’s most valuable AI startups. It raised nearly $1 billion to build advanced AI tools focused on Indian languages such as Hindi, Tamil, Telugu, and others. Sarvam-M is a large language model with 24 billion parameters, the internal weights it learns from data that allow it to understand and generate text.

The model is built on a popular open-source architecture called Mistral Small and then fine-tuned on a massive amount of Indic language data.

The goal was to create a model that understands Indian languages better than existing models that mostly focus on English.
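
For developers who want to try it, the model’s weights are distributed as open downloads on Hugging Face. The sketch below shows one minimal way to load and prompt such a model with the transformers library; the repository id sarvamai/sarvam-m is an assumption based on the public listing, and a 24-billion parameter model needs a large GPU or a quantized variant to run locally.

```python
# Minimal sketch: loading Sarvam-M with the Hugging Face transformers library.
# The repository id "sarvamai/sarvam-m" is an assumption based on the public
# listing; a 24B-parameter model needs a large GPU or a quantized build.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-m"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Prompt in Hindi: "What is the capital of India?"
prompt = "भारत की राजधानी क्या है?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same few lines work for any open-weight model on the Hub, which is part of why public download counts are a reasonable, if rough, proxy for developer interest.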


Disappointing Launch Numbers

After the launch, the AI and developer community expected Sarvam-M to quickly gain traction.

However, data from public platforms like Hugging Face and GitHub show the model was downloaded only 23 times within 48 hours. This number is tiny compared to expectations.
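
These download figures are public and easy to verify. A minimal sketch using the huggingface_hub client is below; the repository id is again an assumption, and the Hub’s default downloads field counts roughly the last 30 days rather than all time.

```python
# Sketch: reading a model's public download count from the Hugging Face Hub.
# The repository id is an assumption for illustration; the "downloads" field
# reported by the Hub is roughly a rolling 30-day figure, not an all-time total.
from huggingface_hub import model_info

repo_id = "sarvamai/sarvam-m"  # assumed repo id
info = model_info(repo_id)
print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```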


How Korean Students’ Model Dia Stole the Spotlight

At the same time, two Korean college students developed Dia, an open-source LLM with around 13 billion parameters.

Even though smaller than Sarvam-M, Dia focuses on multilingual support and has reached over 200,000 downloads in just one month.

Model      Parameters   Focus          Downloads            Developers
Sarvam-M   24B          Indic          23 (2 days)          Sarvam AI ($1B)
Dia        ~13B         Multilingual   200,000 (1 month)    2 Korean students

This comparison shows that bigger models or bigger funding don’t always guarantee user adoption or impact.


Why Are Indic Models Important?

India has more than 1.4 billion people and 22 official languages. Many Indians speak languages rarely covered by popular AI tools like ChatGPT or Google Bard.

Creating strong Indic models is vital for bridging the digital divide and making AI accessible to millions who don’t speak English.

Applications for Indic models include:

  • Local language education tools
  • Healthcare chatbots in rural areas
  • Voice assistants for government services
  • Legal document analysis in regional languages

But Sarvam-M seems to miss the mark, focusing on improving benchmark numbers rather than building out clear use cases like these.

The Bigger Problem: Chasing Hype Over Real Solutions?

Experts say many Indian AI startups try to build large language models just to join the global AI race. But they often neglect real, hard problems unique to India.

“We don’t need a slightly better 24-billion parameter model,” said an AI researcher based in Bengaluru. “We need tools that solve real problems in Indic languages, like helping farmers, doctors, or local governments.”

Sarvam-M’s performance shows only marginal improvements on Indic benchmarks like IndicGLUE, but it hasn’t convinced users or developers to adopt it yet.
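
IndicGLUE itself is openly distributed, so this kind of benchmark comparison can be reproduced by anyone. The sketch below loads one of its Hindi tasks with the Hugging Face datasets library; the dataset path ai4bharat/indic_glue and the wnli.hi config follow the public AI4Bharat release and should be treated as assumptions if the hosting or naming has since changed.

```python
# Sketch: loading one IndicGLUE task (Hindi WNLI) with the datasets library.
# Dataset path and config name follow the public AI4Bharat release and are
# assumptions; older datasets versions or trust_remote_code may be needed
# if the repository still ships a loading script.
from datasets import load_dataset

wnli_hi = load_dataset("ai4bharat/indic_glue", "wnli.hi")
print(wnli_hi["train"][0])        # one labelled sentence pair
print(wnli_hi["train"].features)  # task schema, including the label set
```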

Why Sarvam’s Indic Model Struggled: Key Reasons

  1. No Clear Use Cases: The model doesn’t solve urgent, real-world problems that users or developers need.
  2. Poor Developer Support: Limited tools and guidance made it hard for developers to adopt Sarvam-M.
  3. Strong Open-Source Competition: Smaller, easier-to-use models like Korea’s Dia gained much more attention.
  4. Overhyped but Underwhelming: Sarvam-M offers only small improvements, failing to excite the AI community.
  5. Not Focused on Local Needs: The model doesn’t address practical challenges in rural or regional India.

What This Means for Indian AI

India’s AI ecosystem has great potential, especially for Indic models. But funding alone won’t solve the challenges. Indian startups must focus on:

  • Building AI that solves clear, local problems
  • Collaborating with communities to understand their needs
  • Making AI tools easy to use and accessible

Otherwise, big launches with little adoption will become the norm.


Conclusion: India’s AI Future Needs Focused Indic Models

Sarvam AI’s $1 billion investment and Sarvam-M launch were meant to boost India’s position in AI.

But the low adoption numbers show that Indian AI startups must rethink their strategies. The future belongs to Indic models that solve real problems, not just models that look good on paper.


Kumar Priyadarshi

Kumar joined IISER Pune after qualifying IIT-JEE in 2012. In his fifth year, he travelled to Singapore for his master’s thesis, which yielded a research paper in ACS Nano. He then joined GlobalFoundries in Singapore as a process engineer working at the 40 nm node. Later, as a senior scientist at IIT Bombay, Kumar led the team that built India’s first memory chip with the Semiconductor Lab (SCL).

