Artificial Intelligence is becoming core business infrastructure, supported by an ecosystem of models large and small, open and proprietary, generalist and specialist. NVIDIA positioned that diversity as essential to a future in which applications, companies and countries all build and use Artificial Intelligence. Jensen Huang described the direction as a combination rather than a contest, arguing that proprietary and open models will coexist within the same innovation stack.
NVIDIA said the next phase of development will not be defined by a single massive model. Industries including healthcare, finance and manufacturing need systems of models tailored to different domains, data types and workflows. At GTC, the company announced the NVIDIA Nemotron Coalition, a global collaboration of model builders and labs focused on advancing open frontier foundation models through shared expertise, data and compute. The coalition's first project will be a base model co-developed by Mistral AI and NVIDIA. It will be shared with the open ecosystem and underpin the next generation of NVIDIA Nemotron models, which have been downloaded more than 45 million times from Hugging Face. NVIDIA is now the largest organization on Hugging Face, with nearly 4,000 team members.
Panelists from companies including LangChain, Thinking Machines Lab, Perplexity, Cursor, Reflection AI, Mistral, OpenEvidence, Ai2, Black Forest Labs and AMP PBC highlighted several themes. One was that Artificial Intelligence agents are becoming capable enough to act as coworkers, handling complex tasks over extended periods. Another was that useful deployment depends on orchestration across multimodal, multi-model and multi-cloud systems, allowing users to hand off tasks without having to choose the underlying model themselves.
Openness emerged as a central requirement for innovation, trust and access. Participants argued that open systems make it easier to study model behavior, advance scientific research and broaden participation beyond a small number of major labs. Open ecosystems were also presented as a way to democratize access to Artificial Intelligence and align contributors around broadly shared infrastructure, while proprietary components remain important for differentiated products and enterprise workflows.
Speakers also emphasized the need for both generalist and specialist systems. Specialized Artificial Intelligence is gaining importance because organizations can combine open foundation models with proprietary data to create domain-specific value. That structure, they argued, mirrors how expertise works in fields such as medicine, where generalists and specialists operate together. The result is a model landscape in which frontier, open and on-device systems all have roles, and in which open components are expected to remain part of every major frontier effort.
