New Delhi: Sarvam and Adya.ai are two of India’s newest startups targeting the red-hot generative artificial intelligence market, but they are also among a new breed of ventures looking beyond large language models like ChatGPT. The reason: industry-specific applications are easier to monetise and help avoid competition with AI giants.
For Sarvam, which seeks to be a full-stack generative AI firm, the approach is two-pronged. The startup, which raised $41 million in its first funding round from venture capital firms Lightspeed and Peak XV last year, is building large language models (LLMs) capable of understanding and processing human languages. It then uses these LLMs to build marketable products such as ‘Agents’, which are trained on domain-specific datasets to gain expertise in fields including healthcare, financial services and law.
“In healthcare, for instance, a generative AI assistant can offer neonatal advice to pregnant women at a cost of around ₹1—no physical doctor’s time can be this affordable,” Pratyush Kumar, cofounder of Sarvam and adjunct faculty at Indian Institute of Technology (IIT), Madras, told Mint last month. “Instead of taking clientele away from doctors, in India, such domain-specific AI models can make healthcare more accessible to a wide population base.”
The approach Sarvam, Adya.ai and some of their peers are taking marks a sobering shift from the initial euphoria around building general-purpose generative AI models. Competing with Microsoft-backed OpenAI’s ChatGPT or Google’s Gemini would be not only expensive but also difficult, given the head start those companies have. AI applications that solve smaller, specific problems offer a better chance of success.
“It’s important for startups to solve specific, targeted problems,” said Ankush Sabharwal, cofounder of CoRover, which has created BharatGPT. “Even within healthcare, the nature of services needed for metro markets is different from that in villages. Because of this, taking a domain-specific approach is key; even building ‘India-focused’ general models is too broad an ambition, and Big Tech is too big a competitor for that.”
Adya.ai, which raised $1.2 million in a pre-Series A funding round from the Indian Angel Network collective, is training its models to power ready-to-deploy AI assistants that serve as customer service agents for e-commerce and retail companies, said Shayak Mazumder, chief executive officer at the startup. “This is only the first domain that we’re targeting, and we’ll expand to more domains in future.”
According to Kashyap Kompella, technology analyst and founder of tech consultancy RPA2AI, building domain-specific AI applications and sub-models for enterprises “is the ideal sweet spot from development cost, market opportunity and monetization approaches.”
He cited parallels with the evolution of the IT industry, where several companies built deep expertise in applications deployed on platforms such as SAP and Salesforce. “The reason for this is that enterprise applications on such platforms are difficult to build, require experienced talent to be hired, and are long-term projects,” he said. “The story is somewhat similar for generative AI.”
This approach is helping India’s early-stage startups earn revenue from their generative AI projects. CoRover, for instance, expects annual revenue to close in on $5 million by FY25. Both Sarvam and Adya.ai also have paying enterprise clients.
New LLM-based generative AI applications are seen as a natural evolution of earlier chat automation. Startups such as the Bengaluru-headquartered Yellow.ai and Pune-based E42.ai have been building conversational agents for over five years now. With generative AI, the likes of Sarvam and CoRover are offering more interactive and less expensive AI agents in local Indian languages, which can help businesses expand customer support across languages.
Building sector-specific solutions, however, has its own challenges. “We do not have enough targeted data for each and every domain,” CoRover’s Sabharwal said. “Because of this, we’re running a platform where enterprise customers can choose among BharatGPT, OpenAI’s GPT family and Google’s Gemini as the foundational models, and build virtual assistants based on any model as deemed fit. The data for the specific domain is brought by the client itself.”
According to Sarvam’s Kumar, while there is a lack of data, especially in Indic languages, the problem has a solution, too. “There are various publicly accessible data sources that we use to collate data and operate AI models. The cost of running domain-specific AI models is not the problem,” he said. “The only issue is that this will take some time to mature and be fully deployable within domains that have ever smaller margins for error.”