India emerges as the largest market for large language model platforms

A BofA Securities report finds that India has become the largest market worldwide for large language model platforms such as ChatGPT, Gemini and Perplexity, driven by telco bundles, cheap data and a young, English-speaking population.

A report by BofA Securities says that the growing use of platforms such as ChatGPT, Gemini and Perplexity has made India the world's largest market for large language model adoption. The brokerage calls India the ideal test bed for agentic artificial intelligence applications built on top of large language models that can reason, plan and execute tasks independently. At the same time, the report cautions that the rapid rise of global platforms could negatively affect the domestic startup ecosystem before local models have a chance to mature or scale.

The report details how usage is breaking records across leading services. ChatGPT downloads peaked at over 24 million in July before dipping slightly and plateauing near that level once OpenAI offered zero-cost subscriptions, while Airtel's bundled pack helped Perplexity peak at over 20 million downloads in October, the report said. ChatGPT is described as the leading large language model platform in India with around 145 million monthly active users, and Gemini comes second at 105 million, it said. On a daily basis, 65 million Indians use OpenAI's ChatGPT, while Gemini comes second at 15 million daily active users. Over 16 per cent of ChatGPT's global base is from India, making the country the largest market for the platform, the BofA Securities report said. The report adds that India is also the largest market for peers such as Google Gemini, which gets 30 per cent of its monthly active users from India, and Perplexity, which derives 38 per cent of its monthly active users from the country.

BofA Securities attributes this scale in part to complimentary subscriptions to large language model applications bundled by domestic telecom operators. Other factors, including over 700 million mobile users, access to affordable data plans that let a user consume over 20 GB of data for just USD 2 a month, and a young, English-speaking population, also helped, it said. The brokerage argues that adoption at scale benefits consumers, who can boost productivity, learning and cross-language communication thanks to support for regional languages, as well as global platforms, which gain access to vast, diverse data to refine models and improve handling of multilingual and culturally nuanced queries. Telecom operators are also seen as beneficiaries, with potential to expand average revenue per user, create upselling opportunities and pursue data center collaborations with global firms. However, the report warns that the network effects of global large language models and their agentic artificial intelligence wrappers may allow them to dominate local demand in a way that could limit opportunities for Indian startups, drawing a parallel with how Meta and Google came to dominate social and video markets through services such as Facebook and YouTube.

SK Group warns DRAM shortages could curb memory use

SK Group chairman Chey Tae-won warned that customers may reduce memory consumption through infrastructure and software optimization if DRAM suppliers fail to raise output. Demand from Artificial Intelligence data centers is keeping the market tight as memory makers weigh expansion against the long timelines for new fabs.

BitUnlocker bypasses TPM-only Windows 11 BitLocker

Intrinsec disclosed BitUnlocker, a downgrade attack that can bypass TPM-only Windows 11 BitLocker protections with physical access to a machine. The technique abuses a flaw in Windows recovery and deployment components and relies on older trusted boot code.

Micron samples 256 GB DDR5 9200 MT/s RDIMM server modules

Micron has begun sampling 256 GB DDR5 RDIMM server modules built on its 1-gamma technology to key ecosystem partners. The company positions the new modules as a higher-speed, more power-efficient option for scaling next-generation Artificial Intelligence and HPC infrastructure.
