DeepSeek AI: what to know about the ChatGPT rival

DeepSeek's R1 is an open-source artificial intelligence large language model that has quickly displaced ChatGPT atop the App Store and claims benchmark wins over OpenAI's o1, while offering a lower-cost API and a reportedly small hardware-training footprint.

Within days of its release, DeepSeek's R1 large language model became a focal point of the artificial intelligence conversation, topping Apple's App Store as the number one free app and rattling markets. The company released R1 as an open-source model, and Mashable reports the launch coincided with headlines about R1 displacing ChatGPT on the App Store, prompting an investor reaction and renewed debate about global competition in artificial intelligence.

DeepSeek published a report claiming R1 outperformed OpenAI's reasoning model o1 on several advanced math and coding benchmarks, including AIME 2024, MATH-500 and SWE-bench Verified. The company said R1 scored just below o1 on Codeforces and performed near o1 on graduate-level science and general knowledge tests (GPQA Diamond and MMLU). Mashable's Stan Schroeder tested R1 by asking it to code a fairly complex web app that parsed public data and built a dynamic travel and weather site, and reported being impressed with the model's capabilities. The article notes that other competitive LLMs exist, such as Anthropic's Claude, Meta's Llama family and Google's Gemini, but frames R1's combination of performance, openness and price as a strong challenge to established models.

DeepSeek emphasized openness and cost as differentiators. Because R1 is open source, programmers can inspect and modify the model, which advocates say helps scale and democratize artificial intelligence work. The model is available through a free web app at chat.deepseek.com and through an API that DeepSeek says is significantly cheaper than OpenAI's o1. The article lists pricing as ?.14 per one million cached input tokens for DeepSeek's reasoning model versus ?.50 per one million cached input tokens for o1; the currency symbol is not stated in the piece.
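Since the article presents R1's API as a lower-cost alternative to o1, a minimal sketch of what a request to it looks like may be useful. This assumes DeepSeek's publicly documented, OpenAI-compatible chat-completions endpoint and the model name `deepseek-reasoner`; both are assumptions here and should be checked against current documentation.

```python
# Hedged sketch: building a chat-completion request for DeepSeek's API,
# which follows the OpenAI-style request format. Endpoint URL and model
# name are assumptions based on DeepSeek's public docs, not this article.

def build_chat_request(prompt: str, model: str = "deepseek-reasoner") -> dict:
    """Construct the JSON payload for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Prove that sqrt(2) is irrational.")

# To actually send it (requires an API key and the `requests` package):
# requests.post("https://api.deepseek.com/chat/completions",
#               headers={"Authorization": f"Bearer {API_KEY}"},
#               json=payload)
```

Because the request shape matches OpenAI's, existing OpenAI client code can typically be pointed at DeepSeek's API by changing only the base URL, API key and model name.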

For industry observers, the other headline-grabbing claim is resource efficiency. Citing DeepSeek engineers and reporting in The New York Times, the article says R1 required only 2,000 Nvidia chips to train, compared with a reported 10,000 Nvidia GPUs for OpenAI models in 2023. That alleged efficiency contributed to a 13 percent dip in Nvidia's stock on the day. Whether R1 sustains user interest and developer momentum remains uncertain, but the release has clearly intensified competition and shifted attention across the artificial intelligence landscape.

