CES 2026 in Las Vegas is showcasing the latest breakthroughs in mobility, automotive technology, artificial intelligence, and autonomous systems, bringing together global innovators, exhibitors, and industry leaders to demonstrate how next-generation AI and robotics are reshaping vehicles and transportation ecosystems. News was announced by Sony Honda Mobility, Bosch, Hyundai, NVIDIA, Lucid, Nuro, Uber, NXP, Teledyne FLIR OEM, MulticoreWare, Qualcomm, TIER IV, LG Innotek, STRADVISION, Visteon, Hesai, Nextbase, Mitsubishi Motors, Valens, and Sakae Riken.
Because there were so many announcements today, we will follow up tomorrow with any news we missed.
Sony Honda Mobility Unveils AFEELA 1 and 2026 Prototype
Sony Honda Mobility (SHM) presented its AFEELA 1 pre-production model and debuted the AFEELA Prototype 2026, outlining a vision of mobility as a “Creative Entertainment Space” powered by AI. AFEELA 1 deliveries in California are planned for late 2026, expanding to Arizona in 2027, while a production model based on the 2026 prototype is expected in the U.S. by 2028. The vehicles feature advanced driver assistance systems (ADAS) evolving toward Level 4, an interactive AFEELA Personal Agent leveraging Microsoft Azure OpenAI for personalized experiences, and in-cabin entertainment co-created with developers through the AFEELA Co-Creation Program. SHM also plans a token-based on-chain mobility service platform to support ideation, development, and evaluation of mobility services. The AFEELA 2026 prototype emphasizes flexible interior space and enhanced user experience, reinforcing SHM’s strategy to merge intelligence, entertainment, and creative engagement in next-generation vehicles.
Bosch @CES 2026
Bosch is using CES 2026 to highlight how its software-hardware integration is driving the future of intelligent mobility. The company projects over €6 billion in software and services sales, mostly from its Mobility division, and is demonstrating new AI-based cockpit systems that personalize the in-vehicle experience using large language and visual models, alongside advanced by-wire systems for autonomous driving and its Radar Gen 7 Premium sensor.
Autonomous trucking startup Kodiak AI announced a strategic partnership with Bosch to develop and scale production of automotive-grade hardware and sensor systems for self-driving trucks. The companies aim to move beyond pilot projects toward commercial freight operations by integrating Bosch's components into a safe, fully autonomous platform.
Bosch outlined its optimistic view that the increasing tech content of vehicles will drive demand for software and high-performance computing, forecasting its software, sensor and network component sales more than doubling by the mid-2030s—anchored by AI-powered vehicle platforms and cockpit systems that enhance driving safety and comfort.
Hyundai Atlas Robot on Monday @CES
Hyundai Unveils Human-Centered AI Robotics Strategy
Hyundai Motor Group used CES 2026 to launch a broad AI Robotics Strategy under the theme “Partnering Human Progress,” signaling a shift from traditional automotive mobility toward human-robot collaboration; the plan involves co-working robots that assist with hazardous tasks, deep partnerships with Boston Dynamics and global AI leaders, and expanding Physical AI across manufacturing, logistics and mobility.
In a CES-first public demonstration, Hyundai and Boston Dynamics showcased the humanoid Atlas robot walking and interacting on stage; this version is destined for industrial use, with planned deployment at Hyundai's Savannah, Georgia EV plant by 2028, highlighting broader ambitions in physical AI and robotics integration in automotive manufacturing.
Hyundai Motor Group’s MobED mobile droid won CES 2026’s Best of Innovation in Robotics, recognizing its autonomous navigation and industrial versatility; commercialization is slated to begin in early 2026, underscoring Hyundai’s progress from concept to practical mobility robotics solution.
NVIDIA Family of Autonomous Tech & AI
NVIDIA Unveils Alpamayo Open-Source AI Models
NVIDIA announced the Alpamayo family of open-source AI models, simulation tools and datasets designed to accelerate the development of safer, reasoning-based autonomous vehicles. The suite includes Alpamayo 1, a 10-billion-parameter vision-language-action model that can perceive, reason about complex scenarios and generate driving trajectories with “chain-of-thought” explanations, as well as AlpaSim, a high-fidelity simulation framework, and a large Physical AI open dataset with over 1,700 hours of diverse driving data. By integrating models, tools and real-world edge-case datasets, Alpamayo aims to help developers fine-tune, test and validate advanced autonomous driving systems that can handle rare and unpredictable conditions, fostering collaboration across automakers and researchers to push toward Level 4 autonomy.
NVIDIA DRIVE AV Software Debuts in All-New Mercedes-Benz CLA
NVIDIA announced that its full-stack DRIVE AV software is being deployed in the all-new Mercedes-Benz CLA to power enhanced Level 2 point-to-point driver-assistance capabilities in the U.S. by the end of this year, marking a significant expansion of NVIDIA’s AI-defined driving technology into production vehicles. The system, built on NVIDIA’s DRIVE AV stack and Halos safety framework, combines end-to-end AI models with classical safety redundancy to assist in urban navigation, collision avoidance, automated parking and complex scenarios with human-like decision-making. By enabling over-the-air updates and a cloud-to-car development pipeline, the collaboration with Mercedes-Benz underscores NVIDIA’s role in turning cars into “living, learning machines” and accelerating scalable deployment of intelligent mobility across the automotive industry.
NVIDIA Expands DRIVE Hyperion Ecosystem
NVIDIA unveiled a significant expansion of its DRIVE Hyperion ecosystem, bringing in a wider group of Tier-1 suppliers, automotive integrators and sensor partners—including Aeva, Bosch, Magna, Omnivision, Sony, ZF Group, and others—to build compute, sensors and control units compatible with its production-ready autonomous vehicle platform. The open DRIVE Hyperion architecture, anchored by dual DRIVE AGX Thor chips and safety tools like NVIDIA Halos, provides scalable real-time compute for AI perception, sensor fusion and domain control needed for Level 4 autonomy, while enabling partners to streamline integration, reduce testing time and accelerate time-to-market for autonomous passenger and commercial vehicles. This ecosystem effort reinforces NVIDIA’s end-to-end strategy for building safe, AI-driven mobility solutions from compute to sensors to software.
Lucid, Nuro, and Uber Production-Intent Robotaxi
Lucid, Nuro, and Uber unveiled their production-intent robotaxi for the first time, showcasing the in-cabin rider experience and next-generation autonomous technologies ahead of a planned launch in the San Francisco Bay Area later this year. The vehicle features a 360-degree sensor array with high-resolution cameras, solid-state LiDAR, and radar integrated into the Lucid Gravity body and roof-mounted halo, which uses LEDs to communicate vehicle status and rider identification. Passengers can personalize their ride with interactive controls for climate, seats, music, and support, while visualizations display what the robotaxi sees and its planned maneuvers in real time. The spacious interior accommodates up to six passengers with ample luggage space. Advanced autonomy is powered by NVIDIA DRIVE AGX Thor hardware and Nuro’s Level 4 end-to-end AI system, currently undergoing on-road testing for safety validation. Pending final approval, production is expected at Lucid’s Arizona factory later in 2026.
NXP Unveils S32N7 Processor
NXP introduced its S32N7 super-integration processor series, designed to fully digitalize and centralize core vehicle functions—including propulsion, vehicle dynamics, body, gateway, and safety—into a single, secure computing hub. Built on 5 nm technology, S32N7 simplifies vehicle architectures by consolidating dozens of hardware modules, enabling automakers to reduce total cost of ownership by up to 20% while creating a scalable foundation for AI-driven features such as personalized driving, predictive maintenance, and virtual sensors. Bosch is the first to deploy the S32N7 within its vehicle integration platform, working closely with NXP on reference designs and safety frameworks to speed adoption. With a scalable portfolio of 32 variants and support for high-performance compute, networking, and AI acceleration, the S32N7 positions itself as a central control point for next-generation, software-defined mobility.
Teledyne FLIR OEM Launches Tura ASIL-B Thermal Camera
Teledyne FLIR OEM unveiled Tura™, the first Automotive Safety Integrity Level (ASIL‑B) thermal longwave infrared (LWIR) camera designed for night vision, ADAS, and autonomous vehicles. Featuring a 640×512 far-infrared sensor with industry-leading sensitivity, Tura can detect pedestrians, animals, and other hazards in complete darkness and adverse conditions such as fog, smoke, and glare. Developed to ISO 26262 functional safety standards, the camera enhances pedestrian automatic emergency braking (PAEB) and supports 360-degree situational awareness in fully autonomous vehicles. Designed for reliability and scalability, Tura integrates with AI perception software, heated enclosures, and shutterless components for continuous all-weather operation, providing OEMs with a cost-effective solution for high-performance safety and autonomous vehicle perception.
MulticoreWare & Qualcomm: Cloud-to-Car AI Workflow for ADAS Using Qualcomm AI Hub at CES 2026
MulticoreWare demonstrated a real-time “cloud-to-car” AI development and validation workflow that enables automakers and Tier-1 suppliers to accelerate deployment of advanced driver-assistance systems. Using Qualcomm AI Hub, AIMET, and Qualcomm’s Cloud AI 100 (QCR100) accelerator, the company showed how complex ADAS perception models can be optimized, quantized from FP32 to INT8, and validated in the cloud with automotive-grade performance before being deployed to edge automotive platforms. The demonstration highlighted faster development cycles for software-defined vehicles, seamless compatibility between cloud and in-vehicle AI hardware, and scalable global validation without on-premises infrastructure, underscoring how Qualcomm and MulticoreWare’s collaboration streamlines AI model optimization and speeds time-to-market.
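The FP32-to-INT8 step in that workflow can be illustrated with a minimal, generic post-training quantization sketch. This is a NumPy illustration of the general technique (symmetric per-tensor scaling), not Qualcomm's AIMET API; the function names and tensor shapes are hypothetical.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor post-training quantization of FP32 weights to INT8."""
    scale = np.max(np.abs(weights)) / 127.0  # map the largest magnitude onto 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate FP32 values, e.g. for cloud-side accuracy validation."""
    return q.astype(np.float32) * scale

# Example: quantize a random FP32 weight tensor and bound the round-trip error
rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)
q, s = quantize_int8(w)
err = np.abs(dequantize(q, s) - w).max()
print(q.dtype, q.nbytes, w.nbytes)  # INT8 storage is 4x smaller than FP32
```

The point of the cloud step is exactly this kind of check: the maximum round-trip error is bounded by half a quantization step, so a model can be validated against its FP32 baseline before being pushed to the in-vehicle hardware.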
TIER IV Level 4+ Autonomous Driving
At CES 2026, TIER IV, a leader in open-source autonomous driving software, will present its latest end-to-end (E2E) AI architectures for Level 4+ autonomy, highlighting modular and monolithic approaches that expand operational design domains for real-world, complex scenarios. Visitors can experience a development vehicle equipped with E2E AI, live simulations, and SOAFEE-enabled software-defined vehicle platforms integrating cloud-native development with edge deployment. Demonstrations will also showcase the enhanced Autoware Open AD Kit, Pilot.Auto applications for off-road use, and neural network models—AutoSpeed and AutoSteer—delivering advanced ADAS evolving toward Level 4+ capabilities. Collaborations with partners including Arm, Excelfore, Corellium, and AWS reinforce TIER IV’s commitment to accelerating open-source, AI-driven autonomous vehicle technologies.
LG Innotek Advanced Autonomous Driving and EV Solutions
LG Innotek unveiled a comprehensive suite of future mobility solutions, featuring an autonomous vehicle mock-up with 16 AD/ADAS products, high-performance LiDAR, heated and self-cleaning camera modules, and integrated sensing technologies for enhanced safety and perception. Visitors experienced in-cabin innovations including under-display cameras with AI image restoration, UWB radar for child presence detection, kick sensors, and advanced vehicle lighting solutions such as ultra-thin pixel modules and Nexlide Air. The exhibition also highlighted EV technologies, including the world’s first 800V wireless Battery Management System (BMS), B-Link integrated battery units, precision motor control, and miniaturized components for lightweight, efficient architectures. CEO Moon Hyuksoo emphasized that CES 2026 demonstrates LG Innotek’s evolution into a total mobility innovation company, combining AI, autonomous driving, EV, and connectivity technologies to redefine vehicle experiences.
STRADVISION SVNet FrontVision and SurroundVision on TI Gen 5 Automotive Platforms
At CES 2026, STRADVISION will demonstrate its SVNet FrontVision and SurroundVision software on Texas Instruments’ Gen 5 automotive platforms, including the TDA4VH and TDA5 families. The demo highlights front-view and surround-view camera perception scenarios commonly used in production ADAS systems, illustrating how workloads scale from current driver-assistance functions to next-generation automotive compute architectures. Presented in video format at the TI booth, the showcase visualizes perception outputs and provides OEMs and Tier‑1 suppliers with practical references for system integration and performance evaluation. The demonstration underscores STRADVISION’s continued collaboration with TI and its focus on scalable, production-ready vision perception solutions for diverse vehicle architectures and cost targets.
Visteon Showcase
Visteon unveiled its most immersive and production-ready showcase to date, featuring AI-enabled cockpit electronics, advanced displays, connectivity solutions, and electrification platforms. Highlights include the SmartCore™ HPC high-performance cockpit controller supporting multi-display experiences and up to 14 cameras, and the upgraded cognitoAI™ ADAS Compute Module for plug-and-play AI in both driver-assistance and in-cabin applications. The Entry Cockpit platform brings advanced digital experiences—including Android Auto and Apple CarPlay on sub-seven-inch displays—to emerging vehicle segments, while Visteon’s display portfolio spans triple-screen, pillar-to-pillar, and next-generation HUD technologies. Connectivity offerings include in-house 5G modules with OTA updates and remote diagnostics, and electrification solutions cover 48V to 800V architectures with GaN-based chargers, DC-DC converters, and AI-powered ePowertrain zonal controllers. Visteon emphasizes platform-based solutions powered by a broad ecosystem of technology partners, enabling scalable, flexible, and software-upgradable vehicle architectures across global markets.
Hesai Next-Gen Automotive and Robotics Solutions
Hesai Technology announced it will double annual lidar production to over 4 million units in 2026 to meet growing demand for ADAS, autonomous vehicles, and robotics. The company highlighted its next-generation L3 automotive lidar suite, including long-range ETX and short-range FTX models optimized for vehicle integration, supporting enhanced safety and blind-spot detection. Hesai’s lidar powers autonomous fleets for robotaxi and robotruck programs with some vehicles using up to eight units, while its compact JT series enables robotics and industrial applications such as AGVs, humanoid robots, and 3D digitalization devices. Backed by robust in-house manufacturing, four generations of proprietary ASICs, and a new Thailand facility set to open in 2027, Hesai emphasizes its role as a key enabler of AI-driven mobility and robotics, with design wins from 24 global automotive OEMs and widespread adoption in EVs and industrial platforms.
Nextbase: Vehicle Accessory-as-a-Service Platform & Mitsubishi Partnership
At CES 2026, Nextbase introduced its Vehicle Accessory as a Service (VAaaS) platform and announced a new partnership with Mitsubishi Motors North America, enabling buyers to add Nextbase dash cams directly through the dealership. The VAaaS platform provides a turnkey, modular solution for automakers, covering product selection, certification, installation, integration, marketing, and after-sales support. Mitsubishi customers will receive fully installed, warrantied dash cams and gain access to Nextbase’s Protection platform, creating recurring revenue opportunities for dealers. With over 5.5 million dash cams sold worldwide and experience with OEMs like Toyota, Kia, and Volkswagen, Nextbase positions VAaaS as a flexible, scalable solution to meet growing in-vehicle safety and connectivity demands in the North American market.
Valens and Sakae Riken MIPI A-PHY E-Mirror at CES 2026
At CES 2026, Valens Semiconductor and Sakae Riken Kogyo introduced the automotive market’s first production-ready e-mirror using MIPI A-PHY connectivity and the Valens VA7000 chipset. The high-resolution, 60fps system delivers significantly more imaging data than existing camera monitoring solutions, enabling more precise ADAS and autonomous driving performance. The product showcases the growing adoption of MIPI A-PHY in Japan and globally, providing high-bandwidth, low-interference connectivity for next-generation in-vehicle imaging. CES attendees will see the e-mirror in a closed demonstration room, alongside other A-PHY-enabled innovations.