Lucid will launch its first car, the Lucid Air, with a complete sensor set for autonomous driving from day one, including camera, radar, and lidar sensors. Mobileye was chosen to provide the primary compute platform, full 8-camera surround-view processing, sensor fusion software, Road Experience Management (REM™) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy. These technologies will support a full Advanced Driver Assistance System (ADAS) suite at launch and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.
Mobileye is expected to provide a dual set of EyeQ4 systems-on-chip. The chipset will process the full 8-camera surround-view system, providing 360-degree visual perception. Consistent with other Mobileye programs, the camera set includes a forward-facing trifocal camera and an additional five cameras surrounding the vehicle.
In addition, Mobileye will offer sensor fusion software that combines data from the radar and lidar sensors with the camera set to build the environmental model critical to autonomous driving.
To complete and strengthen the environmental model, Mobileye’s REM™ system is intended to provide the vehicle with highly accurate localization capability. Lucid vehicles will benefit from the near-real-time updating of the collaborative, dynamic global Roadbook™ high-definition mapping system. Data generated by Lucid vehicles can be used to enhance the autonomous driving software and will also contribute to the aggregation of Mobileye’s global Roadbook™.