At CES in Las Vegas, Nvidia announced that its global Drive Hyperion ecosystem is expanding to include a wider set of tier 1 suppliers, automotive integrators and sensor partners, including Aeva, AUMOVIO, Astemo, Arbe, Bosch, Hesai, Magna, Omnivision, Quanta, Sony and ZF Group. The company framed Drive Hyperion as the backbone of the transition to autonomous mobility, with vice president of automotive Ali Kani saying the unified platform brings compute, sensors and safety together so automakers and software developers can bring full autonomy to market faster and more reliably. Nvidia said the integrated ecosystem is meant to give automotive customers confidence that sensing systems and other hardware will be fully compatible with Drive Hyperion, easing integration while cutting development complexity, testing time and costs.
Nvidia detailed a growing sensor and control ecosystem around Drive Hyperion, noting that Astemo, AUMOVIO, Bosch, Magna, Quanta and ZF Group are building electronic control units based on the platform. AUMOVIO, along with Aeva, Arbe, Hesai, Omnivision and Sony, is also qualifying its sensor suites on the open, production-ready Drive Hyperion architecture, spanning cameras, radar, lidar and ultrasonic technologies to support level 4 autonomy. The company emphasized that centralized compute and sensor fusion enable cross-domain control of braking, suspension and steering, delivering the synchronized, low-latency actuation that advanced automated driving requires. By building domain controllers or qualifying sensors on Drive Hyperion, partners gain seamless compatibility with Nvidia's full-stack autonomous vehicle compute platform, which is intended to speed development and accelerate time to market.
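To make the cross-domain idea concrete, the sketch below shows how a single fused world model can drive synchronized commands to braking, steering and suspension. It is a minimal illustration only; every class and function name here is hypothetical and does not correspond to Nvidia's actual Drive software interfaces.

```python
# Minimal sketch of centralized sensor fusion feeding cross-domain
# actuation. All names are hypothetical, not Nvidia Drive APIs.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FusedScene:
    """One 360-degree world model built from every sensor modality."""
    timestamp_ns: int
    obstacles: List[Dict]  # fused camera/radar/lidar/ultrasonic detections

@dataclass
class ChassisCommand:
    """A single time-stamped command spanning all three chassis domains."""
    timestamp_ns: int
    brake_pressure: float  # normalized 0..1
    steering_angle: float  # radians
    suspension_mode: str   # e.g. "comfort" or "firm"

def fuse(camera, radar, lidar, ultrasonic, now_ns) -> FusedScene:
    # A production stack would run transformer-based perception here;
    # simple concatenation is enough to show the data flow.
    return FusedScene(now_ns, camera + radar + lidar + ultrasonic)

def plan(scene: FusedScene) -> ChassisCommand:
    # Because every domain is commanded from the same fused scene and
    # the same timestamp, braking, steering and suspension stay
    # synchronized instead of reacting to sensors independently.
    emergency = any(o["dist_m"] < 5.0 for o in scene.obstacles)
    return ChassisCommand(
        timestamp_ns=scene.timestamp_ns,
        brake_pressure=1.0 if emergency else 0.0,
        steering_angle=0.0,
        suspension_mode="firm" if emergency else "comfort",
    )
```

The design point is that one timestamped scene yields one coordinated command, which is what the synchronized, low-latency actuation claim refers to.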
At the core of the ecosystem is Nvidia Drive Hyperion, a production-ready compute and sensor reference architecture designed to make vehicles level 4 ready. Featuring two Nvidia Drive AGX Thor systems-on-a-chip built on the Nvidia Blackwell architecture, Drive Hyperion delivers more than 2,000 FP4 teraflops (roughly 1,000 INT8 trillion operations per second) of real-time compute to fuse a full 360-degree sensor view. Nvidia said this performance supports transformer-based perception, vision-language-action models and generative AI workloads that can reason about complex driving scenes in real time, allowing partners to differentiate at the software and service layers while relying on a common hardware and safety foundation. Safety and trust are anchored by Nvidia Halos, a safety and cybersecurity framework spanning from the data center to the vehicle, combined with large-scale simulation and AI data factory workflows for continuous testing across millions of virtual and real-world scenarios. Nvidia also introduced Alpamayo, a new family of AI models and tools optimized for real-time performance on Drive Hyperion to make level 4 development more accessible for both passenger and commercial fleets, underscoring the company's end-to-end approach, from high-performance computers and sensor integration to AI training and simulation.
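The two headline numbers are consistent with each other: FP4 operations typically run at about twice the rate of INT8 on the same tensor hardware, so a part rated above 2,000 FP4 teraflops works out to roughly 1,000 INT8 TOPS. A quick sanity check, assuming that 2:1 throughput ratio:

```python
# Sanity check on the quoted compute figures, assuming FP4 tensor
# throughput is ~2x INT8 throughput on the same silicon.
fp4_tflops = 2000              # "more than 2,000 FP4 teraflops"
fp4_per_int8 = 2               # assumed FP4:INT8 throughput ratio
int8_tops = fp4_tflops / fp4_per_int8
print(f"~{int8_tops:.0f} INT8 TOPS")  # ~1000, matching the quoted figure
```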
