DeepSeek Artificial Intelligence: what to know about the ChatGPT rival

DeepSeek's R1 is an open-source large language model that has quickly displaced ChatGPT atop the App Store and claims benchmark wins over OpenAI's o1, while offering a lower-cost API and a small reported hardware footprint.

Within days of its release, DeepSeek's R1 large language model became a focal point in the Artificial Intelligence conversation, topping Apple's App Store as the number one free app and generating market turbulence. The company released R1 as an open-source model, and Mashable reports that headlines about R1 displacing ChatGPT on the App Store prompted an investor reaction and renewed debate about global competition in Artificial Intelligence.

DeepSeek published a report claiming R1 outperformed OpenAI's reasoning model o1 on several advanced math and coding benchmarks, including AIME 2024, MATH-500, and SWE-bench Verified. The company said R1 scored just below o1 on Codeforces and performed near o1 on graduate-level science and general-knowledge tests (GPQA Diamond and MMLU). Mashable's Stan Schroeder tested R1 by asking it to code a fairly complex web app that parsed public data and built a dynamic travel-and-weather site, and reported being impressed with the model's capabilities. The article notes that other competitive LLMs exist, such as Anthropic's Claude, Meta's Llama family, and Google's Gemini, but frames R1's combination of performance, openness, and cost as a strong challenge to established models.

DeepSeek emphasized openness and cost as differentiators. Because R1 is open source, programmers can inspect and modify the model, which advocates say helps scale and democratize work in Artificial Intelligence. The model is available via a free web app at chat.deepseek.com and through an API that DeepSeek says is significantly cheaper than OpenAI's o1. The article lists pricing of ?.14 per one million cached input tokens for DeepSeek's reasoning model versus ?.50 per one million cached input tokens for o1; the currency symbol is not stated in the piece.

For industry observers, the other headline-grabbing claim is resource efficiency. Citing DeepSeek engineers and reporting in The New York Times, the article says R1 required only 2,000 Nvidia chips to train, compared with a reported 10,000 Nvidia GPUs for OpenAI models in 2023. That alleged efficiency contributed to a 13 percent single-day dip in Nvidia's stock. Whether R1 sustains user interest and developer momentum remains uncertain, but the release has clearly intensified competition and shifted attention across the Artificial Intelligence landscape.


