Alibaba Unveils Qwen3: New Standard in Open-Source Large Language Models

Alibaba launches Qwen3, a groundbreaking open-source large language model family, advancing Artificial Intelligence with hybrid reasoning and multilingual support.

Alibaba has introduced Qwen3, its latest generation of open-source large language models, establishing a new benchmark in Artificial Intelligence innovation. Qwen3 comprises six dense models and two Mixture-of-Experts (MoE) models, with parameter scales ranging from 0.6 billion to 235 billion, now freely accessible worldwide. Developers can leverage these models for diverse applications spanning mobile devices, smart glasses, autonomous vehicles, and robotics. All Qwen3 models are open source and available on platforms such as Hugging Face, GitHub, and ModelScope, ensuring broad developer access and fostering global collaboration.

Qwen3 marks Alibaba's debut in hybrid reasoning models, uniting traditional large language model capabilities with advanced dynamic reasoning. The models are engineered to switch flexibly between 'thinking' mode for complex, multi-step tasks, such as mathematics, coding, and logical deduction, and 'non-thinking' mode for fast, general-purpose outputs. For API users, Qwen3 provides granular control over the duration of its reasoning (up to 38,000 tokens), optimizing performance while containing computational costs. The flagship model, Qwen3-235B-A22B MoE, notably reduces operational expenses compared to other state-of-the-art models, reaffirming Alibaba's commitment to affordable, high-performance Artificial Intelligence.
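The mode switching described above can be sketched in code. The following is a minimal, hypothetical illustration of how a caller might shape a request: the `/think` and `/no_think` soft switches appended to the user turn follow Qwen3's documented convention, while the function name `build_prompt` and the `max_thinking_tokens` field are illustrative assumptions, not an official SDK.

```python
# Hypothetical sketch of shaping a Qwen3 request for hybrid reasoning.
# `/think` and `/no_think` are Qwen3's documented soft switches; the
# `build_prompt` helper and `max_thinking_tokens` key are invented here
# purely for illustration.

def build_prompt(user_message: str, thinking: bool, budget_tokens: int = 38_000) -> dict:
    """Attach a mode switch and a reasoning-token budget to a request."""
    switch = "/think" if thinking else "/no_think"
    return {
        "messages": [{"role": "user", "content": f"{user_message} {switch}"}],
        # Cap reasoning length; the article notes API users can tune this
        # up to roughly 38,000 tokens.
        "max_thinking_tokens": budget_tokens if thinking else 0,
    }

# Complex task: enable thinking mode with the full reasoning budget.
req = build_prompt("Prove that the sum of two odd numbers is even.", thinking=True)
```

A caller would pick `thinking=True` for multi-step math or coding tasks and `thinking=False` for latency-sensitive chat, trading reasoning depth against cost per request.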

The Qwen3 suite is trained on an expansive dataset of 36 trillion tokens, twice that of its predecessor, resulting in significant advancements in reasoning, instruction following, tool use, and multilingual tasks. Key features include superior support for 119 languages and dialects, robust agent-task integration through native Model Context Protocol and function-calling, leading benchmark scores in mathematics and coding, and enhanced human alignment for natural dialogue and creative applications. The models achieved top-tier results across industry benchmarks including AIME25, LiveCodeBench, BFCL, and Arena-Hard, driven by a complex four-stage training process focused on reinforcement learning and reasoning fusion.
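Qwen3's function-calling support mentioned above typically works by declaring tools as JSON schemas and dispatching the model's structured tool calls locally. The sketch below assumes the widely used OpenAI-compatible tool format; the `get_weather` tool and the `dispatch` helper are invented for illustration and are not part of Qwen3 itself.

```python
# Hypothetical function-calling sketch: declare a tool schema, then
# dispatch a structured tool call returned by the model. The tool and
# dispatcher are illustrative, not an official Qwen3 API.
import json

get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def dispatch(tool_call: dict) -> str:
    """Route a model-emitted tool call to a local implementation."""
    args = json.loads(tool_call["arguments"])
    if tool_call["name"] == "get_weather":
        return f"Sunny in {args['city']}"  # stubbed result for the sketch
    raise ValueError(f"unknown tool: {tool_call['name']}")

# The model would emit something like this after seeing the tool schema.
print(dispatch({"name": "get_weather", "arguments": '{"city": "Hangzhou"}'}))  # prints "Sunny in Hangzhou"
```

In a real agent loop, the dispatcher's result would be appended to the conversation as a tool message so the model can compose its final answer.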

Open access is central to Qwen3's release, as Alibaba aims to accelerate Artificial Intelligence innovation across industries. Since inception, the Qwen model family has recorded over 300 million downloads worldwide, with more than 100,000 derivative models created by the developer community. Qwen3 already underpins Alibaba's Artificial Intelligence super assistant app, Quark, and will soon be available via its Model Studio platform. This comprehensive open-source approach signals Alibaba's ambition to redefine the global landscape of large language models and hybrid Artificial Intelligence solutions.


