Autonomous and Self-Driving Vehicle News: Mobileye, Transdev, Magna, Ibeo & Neural Propulsion Systems

In autonomous and self-driving vehicle news are Mobileye, Transdev, Magna, Ibeo and Neural Propulsion Systems.

Mobileye, Transdev & Lohr Partner for Autonomous Shuttles

Mobileye, an Intel company; Transdev Autonomous Transport System (ATS), the part of Transdev Group dedicated to autonomous mobility solutions; and Lohr Group, a mobility solutions manufacturer, announced a strategic collaboration to develop and deploy autonomous shuttles. The companies are integrating Mobileye’s self-driving system into the i-Cristal electric shuttle, manufactured by Lohr Group, with plans to deploy it in public transportation services powered by fleets of self-driving shuttles across the globe, starting in Europe.

“Our collaboration with Transdev ATS and Lohr Group serves to grow Mobileye’s global footprint as the autonomous vehicle (AV) technology partner of choice for pioneers in the transportation industry,” said Johann Jungwirth, vice president of Mobility-as-a-Service at Mobileye. “Mobileye, Transdev ATS and Lohr Group are shaping the future of shared autonomous mobility, and we look forward to bringing our self-driving solutions to regions all over the world.”

“This collaboration between Transdev ATS, Lohr Group and Mobileye will enable the deployment of autonomous vehicles in public transportation networks at scale, thanks to the combination of the three partners’ complementary cutting-edge technologies and strong industrial expertise,” said Patricia Villoslada, executive vice president of Transdev ATS. “Together we will bring new mobility solutions to reality in the coming years.”

“The collaboration between Transdev ATS, Mobileye and Lohr Group is set to provide fully industrialized autonomous shuttles at scale to support the urban autonomous vision,” said Marie-José Navarre, vice president of Lohr Group. “Our common goal is to quickly provide clients with autonomous shuttles that can be easily and efficiently deployed in cities.”

By integrating the autonomous i-Cristal shuttle into Transdev’s existing mobility service networks, the companies aim to improve the efficiency and convenience of mass transportation solutions. Autonomous mobility can be woven into the fabric of transportation networks to distribute service when and where it’s needed, while also optimizing the fleets, lowering transportation costs and improving customer experiences.

Over the next year, Mobileye will work with Transdev ATS and Lohr Group to integrate and deploy i-Cristal autonomous shuttles leveraging Mobileye’s AV technology, Transdev ATS’s technology and Lohr Group’s industrial expertise. The three companies will initially test vehicles on roadways in France and Israel, aiming to ready technology designs for production by 2022. The companies expect to deploy self-driving i-Cristal shuttles in public transportation networks by 2023.

Through the collaboration, Mobileye and Transdev ATS will bring their technologies into the electric i-Cristal shuttle, manufactured by Lohr Group, which offers space for up to 16 passengers and is fully accessible via a ramp. The shuttle can travel at speeds up to 50 kilometers per hour and is designed to operate safely and efficiently within today’s public transportation networks using Transdev ATS’ solutions, which combine technology such as its AV Supervision system with expertise in deployment and operation services for public transportation operators and cities. The objective is to make self-driving technology a daily reality.

Mobileye’s self-driving system is a turnkey AV solution that delivers safety via two core concepts: Mobileye’s formal Responsibility-Sensitive Safety model for the safety of the system’s decision-making, and a perception system featuring True Redundancy™ whereby two independent subsystems (cameras and radars+lidars) combine to enable robust perception. The self-driving system can also be deployed without geographical limitation thanks to Mobileye’s Road Experience Management™ AV mapping technology through which a proprietary, crowdsourced AV map of the global road network is created and then continuously and automatically updated using data gathered from mass-market advanced driver-assistance systems.

Magna 3D Surround-View for Level 2+

Global automakers will soon be able to offer 3D surround-view systems in more vehicles, thanks to Magna’s new surround view cameras and electronic control units. Starting with 2022 model years and proliferating across multiple customers and vehicle platforms, Magna’s next generation cameras and domain controllers will help make the benefits of 3D surround view – a driver-assistance technology found mainly in luxury-class vehicles – available to more consumers.

Magna’s multi-camera system provides a high-resolution, 360-degree field of view around the vehicle. The domain controller then creates a 3D surround-view image by processing the four camera images, rendering a seamlessly stitched 3D view of surroundings in relation to the vehicle. The system helps drivers park in even the tightest of spaces and provides an enhanced level of comfort and convenience.

Magna’s system also delivers data to help improve the performance of other vehicle systems such as emergency braking, and automatically detects and warns the driver when the camera lens is impeded by snow, ice, dirt or raindrops.

“By providing a high-performance surround view platform that is cost-effective, we’re helping our customers bring added safety and convenience to more drivers,” said Uwe Geissinger, Magna Electronics President. “This broader on-the-road experience serves as an excellent enabler for future levels of autonomous driving that will require advanced 360-degree camera performance and full system integration.”

Magna provides advanced driver-assistance systems to automakers around the world, with a focus on the Level 2/2+ systems that serve as the building blocks for future autonomy. Magna-made ADAS can now be found on more than 250 vehicle models, providing features that improve the daily commute and add a layer of driver safety.

Ibeo Solid-State Sensors for Great Wall Motor

Hamburg-based LiDAR sensor specialist Ibeo Automotive Systems GmbH has begun validating its solid-state ibeoNEXT sensors for China’s largest SUV manufacturer, Great Wall Motor (GWM). The aim is to reach series production for automated driving at Level 3, and potentially Level 4, and to define standards that should make autonomous driving possible in the near future. Ibeo has commissioned the German-Chinese testing and validation provider LiangDao for series testing; the test tracks are located in Berlin, Germany, and Beijing, China. Ibeo has been testing automated and networked driving on public roads since 2017, and is also working with TÜV SÜD to improve the safety of components and systems used in autonomous vehicles.

Ibeo reaches an important milestone in preparing for mass production of its ibeoNEXT sensor, as validation begins for L3 automated driving features for the Chinese market. This comes against the background of Ibeo’s contract as the world’s first series supplier of solid-state LiDAR for China’s largest SUV and pick-up truck manufacturer, Great Wall Motor Company. The newly developed ibeoNEXT solid state LiDAR is used in the SUV model Wey. Ibeo has commissioned ZF Friedrichshafen AG to produce the sensors and the control unit.

“A Proof of Concept in the respective markets—in this case, China—is important so that we can complete data compliance with the support of our Chinese partner, obtain abstracted driving behavior data and optimize the test validation tools being used for processing according to the Chinese guidelines,” explains Dr. Dietmar Fiehn, Project Manager of “Silk Road” at Ibeo. “The POC is an important milestone on the way to series production of the ibeoNEXT. We have also established a long and close partnership with LiangDao, which means we are very familiar with the Chinese market and its requirements.”

Neural Propulsion Systems Launches NPS 500

Neural Propulsion Systems (NPS), a pioneer in autonomous sensing platforms, emerged from stealth to launch the NPS 500™, which the company calls the safest and most reliable platform for autonomous vehicles, one that enables the industry to reach its Zero Accidents Vision. The NPS 500 is the world’s first all-in-one, deeply integrated multi-modal sensor system focused on Level 4/5 autonomy.

The radically new sensor-fused system precisely interconnects NPS’ revolutionary solid-state MIMO LiDAR™, super-resolution SWAM™ radar and cameras to cooperatively detect and process 360° high-resolution data that, the company says, gives vehicles the ability to prevent all accidents. The densely integrated sensor system enables vehicles to see around corners and beyond 500 meters of range with ultra-high-resolution accuracy and a highly adaptive frame rate. NPS says these breakthrough capabilities make the NPS 500 10X more reliable than currently announced sensor solutions.

“LiDAR, radar and cameras will all play significant roles in creating the ideal autonomous driving platform and there is no question that tightly connected sensors with onboard data fusion for automated driving enables more functionalities,” said Pierrick Boulay, senior analyst at Yole Développement. “This direction is unique and is likely to succeed in a market that could reach $25B in 2025* for sensing and computing in both ADAS and robot vehicles.”

“Our goal to prevent all transportation accidents is the holy grail for autonomous vehicles,” said Behrooz Rezvani, founder and CEO of NPS. “We are the sensing system behind the Zero Accidents Platform for large volume deployment at affordable cost. Existing technologies are not sufficient to achieve this paradigm, so we created our own more powerful LiDAR and radar. Our AI-driven sensor-fusion system processes this ultra-high resolution data to create the safest and most reliable solution in the market today. The NPS 500 slashes time-to-market for autonomous vehicle manufacturers, while being the most cost-effective.”

NPS 500 Product Details
The NPS next-generation, precision-built, multi-modal sensor system is the industry’s most advanced autonomous driving solution, addressing the physics-based limitations of each sensor modality. The NPS 500 combines the strengths of LiDAR, radar and cameras to create a platform that leverages the capabilities of each technology while addressing today’s challenges of Level 4/5 autonomy, including:

  • Cameras: Provide high-resolution images, but lack depth information and depend on lighting conditions
  • Radar: Measures velocity with great precision, but has lower resolution than LiDAR and is vulnerable to interference from other radars
  • LiDAR: Provides precise depth information, but its performance and reliability degrade in adverse weather and light conditions, and it can be occluded fairly easily


Features:

  • LiDAR: Revolutionary new solid-state MIMO-LiDAR™ architecture doubles range to ≥ 500 meters with super resolution and adaptive multi-beam search
  • Radar: New class of radar technology with 10X better detection reliability, simultaneous multi-band 360° FoV and 70X better resistance to interference from other radars
  • Software: First-ever AI fusion technology to “see around the corner”
  • Chips: 650 Tb/s of sensor-data processing on a network of tightly connected custom signal processing chips

Benefits:

  • Range ≥ 500 meters @ 10% reflectivity
  • Doubles the reaction time of currently available LiDAR
  • Significantly increases sensor data reliability
  • Sees around corners
  • Anticipates pedestrian movement well before reaching an intersection
  • Detects moving objects approaching intersections well in advance
  • Built-in redundancy for maximum reliability in harsh environments, bad driving conditions and tough terrain
  • Low maintenance; automakers can rely on NPS sensors once vehicles leave the dealership
  • Multi-beam adaptive scan up to 100 FPS to detect and track subtle movements
  • Sees through occlusions
  • Reduced time to market
  • Cost-effective
  • Low CAPEX and OPEX for OEM customers