At GTC 2026 in San Jose, California, on Monday, March 16, NVIDIA announced collaborations with leading automakers including BYD, Geely, Isuzu, and Nissan to develop Level 4 autonomous vehicles based on the NVIDIA DRIVE Hyperion platform.
According to a blog post cited by IT Home, BYD, Geely, and Nissan are developing next-generation Level 4 autonomous driving projects based on NVIDIA’s DRIVE Hyperion mass-production-grade computing and sensor architecture. Meanwhile, Isuzu and TIER IV are also developing Level 4 autonomous buses using the DRIVE AGX Thor chip, a core component of this platform.
In the mobility-services sector, NVIDIA is pushing the large-scale deployment of software-defined robotaxis. NVIDIA and Uber announced an expanded collaboration, planning to launch a fleet of fully autonomous vehicles powered by NVIDIA DRIVE AV software in 28 cities across four continents by 2028.
The program will launch first in Los Angeles and the San Francisco Bay Area in the first half of 2027. In addition, global mobility players such as Bolt, Grab, and Lyft are leveraging the DRIVE Hyperion platform to accelerate their own autonomous driving businesses.
On the safety front, NVIDIA introduced Halos, a unified safety architecture intended to provide end-to-end safety assurance for Level 4 systems. It is built on DriveOS, which holds the highest functional safety certification (ASIL D), and adopts a three-layer design that integrates safety middleware with deployable applications and incorporates an active safety stack meeting NCAP five-star standards.
This architecture is meant to bring automotive-grade reliability to inference-based AI systems. Ecosystem partners including Quanta, Hesai, and Valeo have joined the NVIDIA Halos AI System Testing Lab to jointly uphold safety standards for autonomous driving.
On the software side, NVIDIA released a major upgrade to its open-source driving model, Alpamayo 1.5. The model takes driving video, navigation instructions, and even natural-language prompts as input, and outputs a driving trajectory together with the reasoning behind it.
This interactive guidance lets developers set driving constraints directly via text. When the vehicle encounters unusual road conditions or complex human behavior, scene replay and updated instructions help the model learn more efficiently and make safer decisions.
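To make the idea of text-specified driving constraints concrete, here is a deliberately simplified sketch of how a natural-language rule could gate the choice among candidate trajectories. Everything in it (the `Trajectory` type, the toy parser, the scoring rule) is an assumption for illustration only, not NVIDIA's Alpamayo API or its actual planning logic.

```python
# Illustrative toy: picking a driving trajectory under a text-derived
# constraint. All names are hypothetical; this is NOT NVIDIA's API.
from dataclasses import dataclass

@dataclass
class Trajectory:
    label: str
    min_gap_m: float    # closest approach to other road users (meters)
    progress_m: float   # forward progress along the route (meters)

def parse_constraint(prompt: str) -> float:
    """Toy parser: only understands 'keep at least N meters clearance'."""
    words = prompt.split()
    for i, w in enumerate(words):
        if w == "least" and i + 1 < len(words):
            return float(words[i + 1])
    return 0.0  # no recognizable constraint -> no minimum clearance

def pick_trajectory(candidates, prompt):
    min_gap = parse_constraint(prompt)
    # Drop candidates that violate the clearance constraint...
    feasible = [t for t in candidates if t.min_gap_m >= min_gap]
    # ...then prefer the one that makes the most route progress.
    return max(feasible, key=lambda t: t.progress_m) if feasible else None

candidates = [
    Trajectory("overtake", min_gap_m=0.8, progress_m=40.0),
    Trajectory("follow",   min_gap_m=2.5, progress_m=25.0),
]
best = pick_trajectory(candidates, "keep at least 1.5 meters clearance")
print(best.label)  # -> follow
```

The point of the sketch is only the shape of the mechanism: a language instruction becomes a hard constraint that filters the planner's options before an objective is optimized.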
Testing and validation are the final hurdle before Level 4 autonomous driving can be deployed. To address this, NVIDIA launched Omniverse NuRec, built on 3D Gaussian splatting. The technology ingests real-world data directly to quickly reconstruct and render interactive 3D simulation scenes.
This lets developers run edge-case stress tests on autonomous driving systems without spending significant time and resources on manual modeling. Institutions such as the Porsche Research Center and the University of Michigan's Mcity test facility have already integrated NuRec into their R&D workflows.
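For readers unfamiliar with the underlying technique, here is a minimal sketch of the core idea behind 3D Gaussian splatting: the scene is represented as many 3D Gaussians, each projected through a camera and "splatted" onto the image as a 2D Gaussian footprint. This toy (isotropic Gaussians, a naive blend, no learned colors or covariances) is my own illustration of the general technique, not NuRec's implementation.

```python
# Toy 3D Gaussian splatting: project isotropic 3D Gaussians through a
# pinhole camera and accumulate their 2D footprints into an image.
import numpy as np

def splat(gaussians, f=100.0, w=64, h=64):
    """gaussians: list of ((x, y, z) center, radius, opacity),
    assumed sorted near-to-far for the simple blend below."""
    img = np.zeros((h, w))
    ys, xs = np.mgrid[0:h, 0:w]
    for (x, y, z), r, alpha in gaussians:
        # Pinhole projection of the center and of the radius.
        u = f * x / z + w / 2
        v = f * y / z + h / 2
        sigma = f * r / z  # screen-space footprint shrinks with depth
        d2 = (xs - u) ** 2 + (ys - v) ** 2
        footprint = alpha * np.exp(-d2 / (2 * sigma ** 2))
        img += (1 - img) * footprint  # simple front-to-back blend
    return img

# Two Gaussians at different depths; the nearer one splats larger.
scene = [((0.0, 0.0, 5.0), 0.3, 0.9), ((1.0, 0.5, 10.0), 0.3, 0.5)]
image = splat(scene)
print(image.shape)  # -> (64, 64)
```

Because each primitive rasterizes to a small, differentiable footprint, real systems can both render interactively and fit the Gaussians to captured sensor data, which is what makes reconstruction from real-world drives fast.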
