Three young founders, one bold LLM: how HelpingAI is rewriting India’s AI playbook

HelpingAI, founded by three Indian technopreneurs, has built Dhanishtha, a token-efficient, emotionally aware large language model (LLM) that its creators say cuts latency and inference cost to accelerate artificial intelligence (AI) adoption in India.

HelpingAI was founded by three Indian technopreneurs in their late teens and early twenties: Nishith Jain (CEO), Varun Gupta (CTO), and Abhay Koul (chief AI officer). The team says it bootstrapped early research and pooled technical talent to tackle three specific barriers to local AI adoption: token inefficiency, empathy gaps in conversational models, and latency on low-infrastructure networks. Their flagship model, Dhanishtha, is framed as an engineering response to those constraints and as a culturally aware alternative to Western LLMs for Indian enterprises and service sectors.

Dhanishtha departs from conventional transformer-only workflows by interleaving reasoning and generation through what the company calls intermediate reasoning, or mid-response processing. The model emits traceable blocks for reasoning alongside separate blocks for sentiment-adaptive replies, which HelpingAI says supports a claimed emotional-intelligence score of 98.13 and improves traceability. Architecturally, Dhanishtha ships in scalable sizes from about 2 billion to 15 billion parameters, designed to allow mobile inference at the small end and enterprise workloads at the large end. HelpingAI lists its training sources as multilingual web data, code repositories, domain-specific corpora, and emotionally tagged conversational data. Optimization techniques highlighted include token compression (up to 5× fewer tokens than some competitors), early stopping, gradient clipping, hyperparameter tuning, in-memory caching, and auto-scaling cluster management. Company benchmarks claim Dhanishtha is roughly 4× faster than DeepSeek at inference.
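To make the token-compression claim concrete, here is a back-of-the-envelope sketch of why fewer output tokens translate directly into lower inference cost under a linear per-token pricing model. The response length and price are illustrative assumptions, not HelpingAI figures; only the 5× compression ratio comes from the company's claims.

```python
def inference_cost(tokens: int, price_per_1k_tokens: float) -> float:
    """Linear cost model: providers typically bill per 1,000 tokens generated."""
    return tokens / 1000 * price_per_1k_tokens

baseline_tokens = 1000                     # hypothetical response length
compressed_tokens = baseline_tokens // 5   # claimed up-to-5x token compression

PRICE = 0.50  # illustrative $ per 1k tokens, not a real quote

baseline = inference_cost(baseline_tokens, PRICE)
compressed = inference_cost(compressed_tokens, PRICE)
print(f"baseline ${baseline:.2f} vs compressed ${compressed:.2f}")
```

Because autoregressive decoding time also scales roughly linearly with output length, the same ratio is the mechanism behind the latency reductions the company cites, independent of any architectural speedups.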

The startup positions Dhanishtha for enterprise automation, healthcare and telemedicine triage, adaptive education tools, retail personalization, SMB developer tools, and other verticals where cultural and linguistic nuance matters. HelpingAI identifies three scaling challenges in India — limited high-end compute, scarce specialist talent, and funding gaps beyond MVP — and outlines countermeasures: efficiency-first model design, community engagement for talent pipelines, domain partnerships for monetization, and last-mile integration tools for non-technical adopters. The roadmap includes scaling parameter counts, adding vision and speech modules, ecosystem partnerships, domestic talent development, and embedding visible reasoning steps for enterprise auditability. If the performance and efficiency claims hold at scale, HelpingAI argues, Dhanishtha could become a credible homegrown rival to global LLMs in targeted Indian and emerging markets.

Impact Score: 70

IT, business leaders clash over cloud, data security

A Unisys survey of 1,000 C-suite and technology executives found that organizations plan to increase cloud spending despite disappointing returns. Fewer than half of the roughly 300 business executives surveyed said they were satisfied with the ROI on cloud, generative AI, and automation, even as more than three quarters plan to boost cloud investment.
