In this Article
Applied EV Digital Controls
Applied EV, a leader in fully autonomous Software Defined Machines (SDM), unveiled an upgraded technology stack at the Consumer Electronics Show (CES) in Las Vegas, designed specifically for the mass production of fully autonomous vehicles.
Designed for the industry at large, the Generation Six Digital Backbone brings new levels of safety with multiple layers of redundancy, making it an ideal solution for Level Four and Five autonomous applications.
In a world-first, the proprietary Digital Backbone™ is commercially available to OEMs to enable machines of all types to run entirely on software.
The latest launch is part of Applied EV’s plans to improve transport solutions and deploy commercially viable Mobility as a Service (MaaS) offerings across the globe.
Originally designed and manufactured in Melbourne, Australia, the Blanc Robot™ is a cabinless universal ‘turn-key’ solution.
In addition to multiple worldwide trials with partners such as the Australian Centre for Field Robotics (ACFR) at the University of Sydney, Teijin and Ibeo, the Blanc Robot is already being used for industrial energy and delivery fulfillment services.
In May 2022, in partnership with Oxbotica, Applied EV conducted Europe’s first test of a cabinless autonomous vehicle on publicly accessible streets, in Oxford, England.
In September 2022, Applied EV announced a strategic investment from Suzuki Motor Corporation. The companies are currently exploring the deployment of Applied EV systems in future Suzuki EV products and the production of autonomous vehicles at scale.
The Digital Backbone is set to improve the feature set of Suzuki’s vehicles and accelerate the transition to autonomous transport, while simultaneously reducing hardware complexity and providing cost efficiencies.
AEVA 4D LiDAR
Aeva® (NYSE: AEVA), a leader in next-generation sensing and perception systems, and Plus, a global provider of autonomous driving solutions, unveiled a design for the next-generation PlusDrive highly automated driving solution integrated with Aeva’s Aeries™ II 4D LiDAR™ sensor.
“Aeva’s 4D LiDAR provides Plus’s state-of-the-art long range perception with valuable instant velocity detection which will help expand the capabilities of our highly automated trucks,” said Tim Daly, Chief Architect and Co-founder of Plus. “We look forward to furthering our partnership with Aeva and demonstrating our latest vehicle design equipped with Aeries II at CES as we ramp up our global deployment of highly automated trucks.”
Aeva and Plus have been collaborating since 2019 to equip and validate Plus’s autonomous trucking solutions with Aeva’s Frequency Modulated Continuous Wave (FMCW) 4D LiDAR. Today’s announcement builds on the partnership announced in 2021, under which Plus will use Aeva’s 4D LiDAR sensors to augment the long-range perception system in its automated driving products.
Because heavy duty trucks take much longer to stop than passenger cars, they need to detect safety-critical objects, place them in lanes, and assign an accurate velocity at very long ranges. Aeva’s high performance 4D LiDAR senses precise velocity and position for each point which helps Plus trucks sense their environment clearly at long ranges, shorten response time in safety-critical situations, and address edge cases such as objects that traditional 3D LiDAR and other sensors on the vehicle may miss.
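The gap in stopping distance is easy to see with a back-of-the-envelope calculation. The speed, reaction time, and deceleration figures below are illustrative assumptions, not values from Aeva or Plus:

```python
# Illustrative only: why long detection range matters for heavy trucks.

def stopping_distance_m(speed_kph: float, decel_mps2: float, reaction_s: float = 1.5) -> float:
    """Reaction distance plus braking distance at a constant deceleration."""
    v = speed_kph / 3.6  # km/h -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

# A loaded truck brakes far more gently than a passenger car (assumed values).
car = stopping_distance_m(speed_kph=105, decel_mps2=7.0)
truck = stopping_distance_m(speed_kph=105, decel_mps2=3.0)

print(f"car:   {car:.0f} m")    # roughly 105 m
print(f"truck: {truck:.0f} m")  # roughly 186 m
```

Under these assumptions the truck needs nearly twice the clear road of the car, which is why per-point velocity at long range matters more for heavy vehicles.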
Cyngn Partners with Ouster
Ouster, Inc. (NYSE: OUST) (“Ouster”), a leading provider of high-resolution digital lidar sensors, announced today that it has signed a strategic customer agreement with Cyngn (Nasdaq: CYN), a developer of innovative autonomous driving software solutions for industrial and commercial applications. The agreement will add Ouster’s new REV7 digital lidar sensors to the Cyngn DriveMod platform that delivers autonomous solutions for both existing and new material handling vehicles in 2023.
Ouster showcased Cyngn’s DriveMod platform, outfitted with a REV7 sensor, on a Columbia Stockchaser cargo vehicle at CES 2023.
Cyngn develops and deploys scalable, differentiated autonomous vehicle technology for industrial organizations. Cyngn’s self-driving solutions allow existing vehicle fleets to drive themselves. Together, Ouster and Cyngn aim to provide autonomous solutions to address significant challenges common to many industrial organizations such as labor shortages, costly safety incidents, and increased consumer demand for e-commerce requiring more automation.
SOSLAB 3D Solid-State LiDAR for Light
SOSLAB (CEO: Ji Seong Jeong) announced the release of its next-generation 3D solid-state LiDAR ‘ML-X’ for automotive lamps at CES 2023.
SOSLAB recently unveiled the ML-X, which shrinks the sensor while more than doubling range performance and resolution compared to its previous version. The ML-X improves angular resolution from 0.5° to 0.208° across a 120° FOV, and by adopting a dedicated SoC (system on chip) for laser control in the transmission unit, its size and weight drop to 9.5 × 5.0 × 10.2 cm and 860 g, respectively. In addition, the product requires no additional external module to operate, maximizing user convenience.
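Taking the quoted figures at face value, the resolution gain can be sanity-checked with a quick calculation (a back-of-the-envelope sketch, not SOSLAB data):

```python
# Points per horizontal scan line implied by the quoted angular resolutions.
fov_deg = 120.0
old_res_deg, new_res_deg = 0.5, 0.208

old_points = fov_deg / old_res_deg  # points across the FOV, previous version
new_points = fov_deg / new_res_deg  # points across the FOV, ML-X

print(f"before: {old_points:.0f} points/line")
print(f"after:  {new_points:.0f} points/line ({new_points / old_points:.1f}x)")
```

The quoted step from 0.5° to 0.208° works out to roughly 2.4 times as many points across the field of view.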
SOSLAB will showcase live LiDAR data to the audience through an on-the-spot demonstration, with the ML-X installed in the front and rear lamps of a vehicle at its CES 2023 booth. The company is currently collaborating with SL Corporation, a tier 1 global parts supplier to Hyundai-Kia Motors, on a project to install the ML-X in rear lamps. Under a partnership with Metrolla, a LiDAR solution company in the United States, SOSLAB is preparing a ‘People Counting’ solution that can scan the space inside a booth. Because the solution can analyze foot traffic and congestion in a given space, the company is seeking to commercialize the technology with Metrolla for department stores, airports and train stations in South Korea and other countries.
The company will also introduce GL, a LiDAR for robots that entered commercial production this year. Used in high-speed overhead hoist transports (OHTs) and robots, GL delivers high angular resolution and fast scanning speed. Beyond the OHT market, SOSLAB plans to expand into automated guided vehicles (AGVs) and autonomous mobile robots (AMRs).
LG & Magna Partner for Automated Drive Infotainment
LG Electronics (LG) announced a technical collaboration with Magna, a global mobility technology company and one of the largest suppliers in the automotive space. The two companies have signed an agreement to develop a proof of concept for an automated driving-infotainment solution, aimed at providing differentiated customer experiences and enabling readiness for the future of mobility.
Under the agreement, LG and Magna will explore the technical feasibility of integrating LG’s infotainment capabilities with Magna’s Advanced Driver Assistance System (ADAS) and automated driving technologies.
LG Vehicle component Solutions (VS) Company is recognized as a leader in In-Vehicle Infotainment (IVI) capabilities in the automobile industry. In anticipation of the car of the future, LG VS Company has been proactively exploring its portfolio of future products and technologies.
This concept will focus on creating executable IVI-ADAS solutions to better meet carmakers’ vehicle programs. Both LG and Magna plan to introduce this concept to global automakers during CES 2023.
“We are excited to work with Magna, now in the automated driving sector, to develop a proof of concept that could bring value expanding beyond our cockpit domain,” said Eun Seok-hyun, president of LG VS Company. “We plan to work closely together to demonstrate potential ADAS innovations that could help automakers address some of their toughest challenges.”
This collaboration follows a successful joint venture LG and Magna announced in July 2021, LG Magna e-Powertrain Co., Ltd, to manufacture e-motors, inverters and onboard chargers as well as related e-drive systems, supporting the growing global shift toward vehicle electrification for certain automakers.
TIER IV & Baraja
TIER IV, an open-source autonomous driving leader, is expanding upon its work with Baraja, creator of the breakthrough Spectrum-Scan™ LiDAR technology, by signing a memorandum of understanding (MOU) to jointly validate, develop and optimize their respective autonomous driving solutions.
Following a successful research and development project between the companies, originally announced in January 2022, this technical collaboration intends to integrate Baraja’s next-generation, high-performing LiDAR with TIER IV’s HDR cameras and sensor fusion software, which utilizes point cloud perception. This advanced combination will offer customers an industry-leading autonomous vehicle solution with planned availability in 2025.
Baraja’s revolutionary LiDAR, the Spectrum HD25, is a key ingredient of the MOU. The partnership will boost the capabilities of both companies’ technologies, delivering unique benefits to the automotive industry, including:
- Greatly enhanced localization and scene comprehension
- Optimized use of Baraja’s LiDAR in different environments (e.g. rain, fog, etc.)
- The use of Doppler to solve critical edge cases
- Dependable long-range detection and more
Ultimately, the resulting solution will provide automotive perception algorithms with a clear and accurate representation of the physical world to help enable autonomous vehicles to make swift, accurate decisions based on the immediate situation.
Spectrum HD25 was created using Baraja’s robust and proprietary Spectrum-Scan™ solid-state scanning platform and is a significant leap forward in LiDAR technology. Designed to completely reimagine how cars see the world around them, this LiDAR system is the first in the world to combine per-point Doppler capability at the hardware level with a tunable wavelength laser and Random Modulation Continuous Wave (RMCW) ranging method. Its unique makeup enables the optimal resolution and range required for safe deployment of autonomous functions and allows it to deliver unmatched accuracy and performance.
TIER IV is developing sensor fusion and perception algorithms for the next generation of autonomous vehicles, built on the open-source platform Autoware. Research and development into new sensing methods and strategies is integral to expanding operational design domains through reference designs, and will contribute to growing the Autoware ecosystem. Accelerated development of high performance perception systems will be made available to all Autoware users by incorporating cutting-edge technology such as Baraja’s Spectrum Scan LiDAR into TIER IV’s Pilot.Auto and Edge Perception Development Kit.
“We have made a number of tangible and impactful achievements through the collaborations with Baraja. TIER IV aspires to accelerate its business on the Autonomous Driving Development Kit (ADK) and passionately contribute to the Autoware community by providing competitive reference designs and solutions.” said Shinpei Kato, Founder and CTO of TIER IV, Co-Founder and Chairman of the Board of Directors of the Autoware Foundation.
Ushr Maps for Nissan ProPILOT
Ushr Inc. announced it is supplying its high-definition map data (“HD map”) to Nissan’s ProPILOT Assist 2.0 advanced driver assistance system for the new 2023 Nissan Ariya all-electric crossover.
Nissan’s ProPILOT Assist 2.0 allows drivers to remove their hands from the steering wheel under certain conditions. It has been available in Japan and is making its United States debut on the Ariya, which began arriving in dealerships in late 2022.
“We have been collaborating with Nissan for a long time, and we are excited to grow our relationship with Ushr to help enable Nissan’s ProPILOT Assist utilizing the Mitsubishi Electric High Definition Locator Module,” said Mark Rakoski, vice president, Advanced Engineering, Mitsubishi Electric Automotive America.
Ushr worked with Mitsubishi Electric to integrate its data into ProPILOT Assist 2.0. Ushr also provides its HD map of United States motorways to Mitsubishi Electric, which leverages the data in its High-Definition Location Module (HDLM). Ushr’s precise HD map and Mitsubishi Electric’s robust HDLM solution allow Nissan to deliver an exceptionally confident and accurate hands-off driving experience to its customers.
Ushr’s HD map data fused with Mitsubishi Electric’s HDLM identifies curves in roads much sooner than vision or radar sensors, allowing the system to anticipate turns and comfortably adjust speed. The data allows a vehicle to detect its location within centimeters to provide smoother driving and peace of mind.
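One simple way such a system could turn map curvature into a target speed is by capping lateral acceleration through a curve. The comfort limit and radii below are illustrative assumptions, not details of Ushr’s or Nissan’s implementation:

```python
import math

# Comfortable curve speed from HD-map curve radius: limit lateral
# acceleration so that v^2 / r <= a_lat, i.e. v <= sqrt(a_lat * r).

def comfortable_speed_kph(curve_radius_m: float, max_lat_accel_mps2: float = 2.0) -> float:
    """Highest speed keeping lateral acceleration under the comfort limit."""
    return math.sqrt(max_lat_accel_mps2 * curve_radius_m) * 3.6  # m/s -> km/h

for radius in (150, 300, 600):  # assumed motorway curve radii, in meters
    print(f"radius {radius:4d} m -> {comfortable_speed_kph(radius):5.1f} km/h")
```

Because the map supplies the radius well before the curve is visible to cameras or radar, the system can begin easing off speed early instead of braking at the curve.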
Ushr’s HD map data supports ProPILOT Assist 2.0’s Intelligent Cruise Control, Speed Limit Assist, Speed Adjust by Route, Steering Assist, Lane Change Assist, and Route Assist features.
“We are proud and excited to work with a tech-forward company like Nissan,” said Chris Thibodeau, chief executive officer of Ushr. “Our goal is to offer automakers precise map data that makes drivers feel comfortable using this technology.”
New Continental Sensor Suite
Continental is presenting an innovative sensor solution for commercial vehicles at CES 2023. The commercial vehicle business is more complex than ever due to growing transport volumes, rising equipment rates of trucks with new assistance systems, and increasingly complex fleet management. The Continental Sensor Array provides an answer to the growing number of intelligent and automated driving systems in commercial vehicles. The multi-sensor system can be mounted above modern vehicle windshields to provide a compact, integrated solution for managing the new, complex environments of commercial vehicles.
Additionally, higher levels of automation, such as L4, require a larger number of different sensors that can be installed intelligently, quickly, and safely. In Continental’s comprehensive Sensor Array solution, all integrated sensors – lidar, radar, cameras – are pre-calibrated and coordinated with each other. This supports adaptive cruise control, emergency braking, blind spot assist, and automated driving functions. The coordination significantly simplifies the installation of many sensors, their integration into the vehicle architecture and the complex process of calibration. In addition, the effort for maintenance and the downtimes of vehicles are reduced. Continental is thus helping to make commercial vehicles and logistics companies fit for the mobility of tomorrow.
Compact, modular multi-sensor solution for safe driving with commercial vehicles
The market for automated driving is growing substantially and commercial vehicles play a key role in this. Especially over long distances between logistics centers – from “hub to hub” – innovative assistance systems are already contributing to significantly increased safety. The solutions will continue to revolutionize the transport business, right up to autonomous trucks on the motorways. Sensors, software, and intelligent connectivity concepts are the basis for this. As a leading system expert in radar, lidar and camera solutions for assisted and automated driving, Continental sees the intelligent combination of these different technologies as a decisive added value for safety, comfort, and functional availability on the road to achieving Vision Zero.
“The Continental Sensor Array offers a tailor-made system solution approach for commercial vehicles,” said Vinh Tran, head of the Autonomous Mobility Business Area, Continental North America. “The potential for automated and autonomous driving in the commercial vehicle sector will change the industry, especially when it comes to managing a high number of downtimes. Highly complex sensor systems are required to take advantage of that potential. By offering easy assembly and calibration of multiple sensor systems in a single, compact module, our solution will help prepare companies for the future of commercial mobility.”
Development, calibration, sensor care, updates: a one-stop service
With the new multi-sensor system, Continental provides all safety-relevant sensor technology from a single source: sensors, central computing units for control, calibration of the individual sensors among each other, and calibration of the overall solution when mounted on the vehicle.
“A multi-sensor system in a compact solution offers vehicle manufacturers and users many advantages,” added Tran. “Radar, lidar and cameras are pre-calibrated and matched to each other. This significantly reduces efforts around sensor assembly, sensor calibration, and vehicle maintenance. And if, for example, trucks drive autonomously and are on the road around the clock, our Sensor Array can be easily replaced for maintenance work. That means no more costly downtimes.”
In addition, the electronic architecture and wiring harnesses in commercial vehicles are significantly streamlined – and later updates can be implemented quickly. Sensors for new assistance systems can be easily integrated into the compact solution. The overall sensor system does not require any fundamental modifications to the driver’s cab or vehicle body for future generations of trucks. It can also be mounted on existing generations of modern trucks, provided that the onboard electrical system is prepared for this.
Cleaning and air conditioning of the sensor units are also simplified with Continental’s automatic “Camera and Sensor Cleaning, Cooling and Heating” system. All sensors independently monitor their degree of soiling. Camera lenses are automatically cleaned by a water jet. In addition, intelligent thermal management ensures flawless operation of the sensor technology in all weather conditions.
Camera, lidar, radar – multi-sensor system for safe mobility
The combination of different sensor systems and their redundancy is crucial for the reliable use of driver assistance systems and autonomous driving. All sensors must be calibrated individually and coordinated with each other in combination. Continental sees the joint use of the three sensor systems – camera, lidar and radar – as the ideal solution for reliable recognition of objects and holistic detection of the vehicle’s surroundings. The technology company has more than 25 years of experience in the development and integration of individually tailored, safe and robust sensor solutions from individual components to complete systems. To date, Continental has brought more than 150 million sensors for assisted and automated driving functions on the road.
Continental Partners with Ambarella
At CES 2023, technology company Continental (XETRA: CON) and Ambarella, Inc. (NASDAQ: AMBA), an edge AI semiconductor company, announced a strategic partnership. The two companies will jointly develop scalable, end-to-end hardware and software solutions based on artificial intelligence (AI) for assisted and automated driving (AD), on the way to autonomous mobility.

The strategic collaboration builds on Continental’s announcement in November to integrate Ambarella’s energy-efficient System-on-Chip (SoC) family into its Advanced Driver Assistance Systems (ADAS). Compared to other domain controller SoCs, Ambarella’s “CV3-AD” chip family provides higher performance to process sensor data faster and more comprehensively for greater environmental perception and safer mobility, at up to five times higher power efficiency.

The partners are combining Continental’s software and hardware expertise and broad portfolio of automotive system solutions with Ambarella’s computer vision know-how, powerful SoCs and software modules. In addition to developing camera-based perception solutions for ADAS, Continental and Ambarella are focusing on scalable full-stack systems for Level 2+ up to highly automated vehicles. These full-stack solutions take a multi-sensor approach, including Continental’s high-resolution cameras, radars and lidars, as well as the associated control units and the required software. Vehicle manufacturers will be able to flexibly integrate the joint system solutions into their latest vehicle generations.

In electric vehicles, the energy-efficient solutions reduce power consumption and cooling demands, contributing to a battery weight reduction of several kilograms (an estimated 6 pounds). Based on a typical configuration, this increases average range by about 5-10 kilometers (3-6 miles) with the same battery capacity.
To serve the growing market for assisted and automated driving, while paving the way for autonomous mobility, the partners aim to have these joint solutions ready for global series production in 2026.
Joint system solutions enhanced by artificial intelligence
Extending its portfolio with Ambarella’s CV3-AD System-on-Chip family, Continental will draw upon its proven expertise in sensor technologies, cross-domain and domain-specific high-performance computing systems, and software development, along with deep experience implementing automotive functions, in the growing market for assisted and automated driving. At the same time, Continental will tap into its large software and development ecosystem to offer additional upward scalability in its already broad ADAS full-stack offerings. These systems, based on Ambarella’s CVflow SoCs with AI, provide vehicle manufacturers with a flexible platform to scale their investment costs across all vehicle types.
Continental contributes the hardware and large parts of the software to this partnership, while Ambarella provides the SoC platform and further software functionalities. As a result of this strategic partnership, the next generation of vehicles, ranging from L2+ to the highest automation levels, will be able to utilize the powerful, energy-efficient, and scalable mobility system solutions from Continental and Ambarella.
Ambarella’s CV3-AD AI domain controller SoC family enables centralized, single-chip processing for multi-sensor perception—including high-resolution camera, radar, ultrasonic sensors and lidar—as well as deep fusion of these sensors and autonomous vehicle path planning. These fully scalable, power-efficient SoCs provide industry-leading AI performance per watt for neural network computation, with up to 40x better performance than Ambarella’s CV2 automotive SoC family. Additionally, Ambarella integrates its superior image signal processor technology into all of its SoCs. The result is robust ADAS and L2+ to Level 4 automated driving systems with greater levels of environmental perception in challenging lighting, weather and driving conditions for human vision and edge AI applications.
Luminar Production Wins
Luminar (Nasdaq: LAZR), a leading global automotive technology company, announced more commercial momentum at the end of 2022 than anticipated with additional production wins for multiple consumer vehicle models with leading automakers. At CES this week, Luminar will host the North American debut of the Volvo EX90, as well as the SAIC Rising Auto R7, and unveil its new software-based mapping product developed last year following the acquisition of Civil Maps. Luminar will also unveil an updated brand and vision for consumer-facing audiences in preparation for its launch on additional consumer vehicles.
New Major Commercial Wins Amidst Strong Execution
Luminar secured new major commercial program wins for multiple consumer vehicle models with automotive OEMs in the fourth quarter, bringing its technology to significantly more cars globally starting in 2025. The commercial success exceeded Luminar’s goal of 60 percent year-over-year growth in total program wins and its target of 60 percent year-over-year forward-looking orderbook growth, both major milestones for the company in 2022. These cap off the remaining two of Luminar’s four key company-level 2022 milestones; the other two, start of production (SOP) and the beta milestone for its Sentinel software suite, were already achieved, and Sentinel is being demonstrated to the public at CES. Further details on the milestones and wins will be provided at Luminar Day on February 28.
VEN.AI for Automated Parking
NTT DATA, a global digital business and IT services leader, Valeo, and Embotech, a software scale-up for autonomous driving systems, announce the next big step to provide automated parking solutions. Together, the three companies have built a consortium, VEN.AI, that aims to be the go-to solution provider for production ready parking automation with global roll out capabilities. The consortium combines each company’s core competencies including owned IP, the latest technology, sales and support structure and strong operations offerings.
VEN.AI offers an infrastructure-based solution that has very few requirements from the vehicle side as it guides vehicles via the use of sensors, connectivity (e.g., 5G) and offboard computing to a dedicated parking spot. Automated parking solutions can be implemented in a variety of use cases including the assembly plants where vehicles are produced, outbound-logistics distribution parks, depots for vehicle fleets operators, retail outlets as well as in parking garages as an automated valet service.
In the initial phase, VEN.AI is focusing on manufacturing-specific use cases to help car manufacturers increase the efficiency of their assembly lines by automatically guiding vehicles from one production station to another. From here, it will then drive the vehicle from the end of the assembly line to its dedicated spot on the large launch areas. The outcome is a more efficient assembly line, saving on time and costs.
As the world leader in ADAS sensors and related detection algorithms, Valeo is responsible for supplying the necessary technology to make the parking system function properly. The sensors supplied are used to detect and locate cars in the parking area, as well as to understand the surrounding environment. This information is then used by the algorithms to accurately guide the car to an available parking space.
Safety is a key component of VEN.AI by integrating the latest technology innovations including the use of cameras, light detection, low-latency connectivity and sensors. The system has been designed to fulfill the highest requirements and standards on safety and availability and has the best-in-class building components including Embotech PRODRIVER®. This SAE L4 virtual driver is designed to provide fully flexible, human-like driving, which is a key enabler of automated driving in busy logistics areas with mixed traffic as well as parking into very tight spaces, increasing the space efficiency.
Additionally, VEN.AI is integrating the parking automation solution within global production systems, adding automated charging for electric vehicles, and using internal onboard sensors to improve efficiency and flexibility in large parking areas. The global roll out also includes 24/7 operations support.
Mobileye Eyes Future & Radars
Over the past few years, Mobileye has been developing a new technology to help autonomous vehicles sense and understand their environment – regardless of weather, lighting or road types – in addition to the company’s renowned camera-based perception systems. Known as software-defined imaging radar, or 4D radar, the technology will play a key role in bringing autonomous vehicles and the most advanced forms of driver-assistance technology to life.
Mobileye announced a collaboration with Wistron NeWeb Corp. (WNC) for production of its software-defined imaging radars. WNC, based in Taiwan, works as a major electronics and radar supplier for automakers worldwide. This collaboration is expected to allow Mobileye and WNC to begin producing automotive-grade imaging radars two years from now, with strong initial interest in the technology from key automaker customers.
“The imaging radars we have been developing over the past few years are uniquely designed to be an essential enabler of high autonomy levels in future vehicles, by delivering rich and reliable radar output, upgrading perception-by-radar capabilities, and reducing the need for multiple lidar sensors,” said Yaniv Avital, Mobileye’s Radar Vice President and General Manager. “WNC’s experience and accomplishments as an automotive supplier can help us bring this much-needed innovation to the market by our original targeted timeline and at the expected quality.”
The imaging radar developed by Mobileye goes far beyond the simple devices on vehicles today. Radars emit radio frequency signals to detect obstacles, and just like cameras, the more data they can process, the more details they can spot. When paired with advanced cameras, radars can provide sensing at longer distances and in certain weather or lighting conditions that can even challenge camera-based imaging.
Mobileye’s imaging radars use advanced radar architecture including Massive MIMO (multiple-input, multiple-output) antenna design, a high-end radio frequency design developed in-house, and high-fidelity sampling – all enabling accurate object detection and wider dynamic range. Thanks to an integrated system-on-chip design that maximizes processor efficiency, and world-leading algorithms for interpreting radar data, Mobileye’s imaging radars deliver a detailed, four-dimensional image of surroundings up to 1,000 feet away and beyond. With a 140-degree field-of-view at medium range and 170-degree field of view in close range, the radar enables more accurate detection of pedestrians, vehicles or obstructions that other sensors might miss – even on crowded urban streets.
“The new imaging radar technology is a key focus for future high-level autonomous driving,” said Repus Hsiung, Vice President & General Manager of the Automotive & Industrial Solutions BG at WNC. “We are delighted to collaborate with Mobileye to accelerate the availability of advanced imaging radars in the market. Leveraging our expertise in automotive electronics and radar solutions, we look forward to working with Mobileye to further develop exciting new capabilities.”
Mobileye’s True Redundancy™ approach to autonomous vehicles envisions using imaging radars to create a 360-degree sensing system that operates in addition to, but independently from, a camera-based system. By having multiple systems that are each capable of navigating a vehicle alone, the “eyes off” autonomous system can deliver reliable rides with low chance of failure and simplified safety validation. Imaging radar can also play a role in more advanced hands-free ADAS solutions as an alternative to LiDAR solutions, which are typically far more expensive.
Solid-State Hesai LiDAR
Hesai, a global leader in lidar technology, debuted its new fully solid-state lidar FT120 at CES 2023. The FT120 debuts along with a series of automotive lidars to showcase the exciting progress of OEM and autonomous mobility partnerships in the global market.
As ADAS and autonomous-driving technology advances, sensor requirements and system architecture must evolve to address the needs of the industry. To accommodate the challenges that autonomous vehicles face within their perception system, Hesai has developed the FT120, a fully solid-state lidar for near-range blind spot coverage. The FT120 boasts an impressive 100° × 75° ultra-wide field of view (FOV) and a maximum detection range of 100 meters. It outputs 192,000 points per second (in single return mode) at an overall resolution of 160 (H) × 120 (V).
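Those figures also imply a frame rate, which a quick sanity check recovers (a sketch based only on the numbers quoted above, not a Hesai specification):

```python
# Frame rate implied by the FT120's quoted point rate and resolution
# (single-return mode).
points_per_second = 192_000
h_res, v_res = 160, 120

points_per_frame = h_res * v_res  # points in one full frame
frames_per_second = points_per_second / points_per_frame

print(f"{points_per_frame} points/frame -> {frames_per_second:.0f} Hz")
```

At 19,200 points per frame, the quoted point rate corresponds to a 10 Hz refresh over the full field of view.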
Designed as a blind spot detection sensor for ADAS, the FT120 helps vehicles accurately identify small objects while turning, passing, and parking, improving overall driving safety in even the toughest scenarios. Together with Hesai’s long-range hybrid solid-state lidar AT128, the two sensors form a complete automotive-grade lidar perception solution.
2022 was a high-growth year for Hesai. The company achieved a record milestone of 100,000 lifetime lidar units delivered in December, and its partner ecosystem expanded to include United States companies such as NVIDIA, Zoox, and Nuro. In 2023, Hesai continues to strategically expand globally, and its new intelligent manufacturing center, to be named the Maxwell manufacturing center, will begin operation this summer, boosting mass production and delivery capacity to over a million units per year.
“We have received pre-orders for one million units of the FT120 from top automotive OEMs, and will begin deliveries in the second half of 2023,” says Bob in den Bosch, Senior VP of Global Sales at Hesai.
INFINIQ Sensor Fusion
INFINIQ demonstrated its sensor fusion annotation technology at the CES 2023 with lidar and camera sensors.
INFINIQ is a leading AI data service provider that specializes in autonomous driving. It has an end-to-end data platform called ‘DataStudio’ which includes data collection, data anonymization, and data annotation throughout the whole lifecycle of data processing to improve the perception level of autonomous driving.
INFINIQ provides a highly accurate (99.7%), cost-effective, fast, AI-powered automated video and image labeling service that can increase productivity by up to two times while cutting costs for an AI project.
At the INFINIQ booth at CES, visitors were able to see themselves labeled in real time, with human behavior detected and identified by the company’s AI model. This is crucial because an autonomous vehicle must identify the behavior of people on the street to prevent accidents.
Visitors also experienced an anonymization kiosk with a built-in AI engine in real time. INFINIQ’s anonymization solution, Wellid, can automatically detect and anonymize all identifiable faces and license plates in videos and images at high speed by blurring or deepfaking them. With over 99.9% accuracy, it protects personal information in the vast amounts of data collected for autonomous driving, complying with privacy laws such as GDPR and CCPA.
Quectel ASIL Positioning Solution
Quectel Wireless Solutions, a leading global IoT solutions provider, demonstrated an advanced automotive safety integrity level (ASIL) solution for the automotive market.
The industry-leading positioning solution will provide optimal precision, availability, and reliability for maintaining absolute in-lane positioning, satisfying ASIL level B and suiting advanced driver assistance systems (ADAS) and autonomous driving (AD) systems. The design utilizes Quectel’s LG69T-AB automotive module and will be compatible with the Trimble software positioning engine, the Trimble RTX correction service, the ST Micro ASIL-rated TeseoAPP GNSS chipset, and the Murata SCHA600 ASIL inertial measurement unit (IMU).
“We’re excited to demonstrate this advanced ASIL solution. The interoperability of Quectel’s leading hardware with Trimble’s industry-leading software and service enables us to deliver an advanced positioning solution that will ensure optimal precision, availability, and reliability for all automotive OEM and tier-one manufacturers,” added Mark Murray, Vice President Sales – GNSS and Automotive, Quectel Wireless Solutions.
Reducing time to market and saving cost, the integrated positioning solution will remove risk for OEMs and tier one automotive manufacturers, satisfying the requirements for developing ADAS and automated driving solutions. First generation engineering samples are being tested on the road today.