Autonomous & Self Driving Vehicle News: Motional, Lyft, Luminar, NVIDIA, Innoviz, Torc Robotics, Applied Intuition, Serve Robotics, Uber, Gatik, Walmart, Daimler Trucks, Perrone Robotics, GreenPower, Plus, Aeva, AutoX, QCraft & StradVision

In autonomous and self-driving vehicle news are Motional, Lyft, Luminar, NVIDIA, Innoviz, Torc Robotics, Applied Intuition, Serve Robotics, Uber, Gatik, Walmart, Daimler Trucks, Perrone Robotics, GreenPower, Plus, Aeva, AutoX, QCraft and StradVision.

Motional & Lyft to Offer Autonomous Rides in Vegas

Motional, a global leader in driverless technology, and Lyft, Inc. (Nasdaq: LYFT) announced the planned launch of a fully driverless public ride-hail service in Las Vegas, the first city in a multimarket deployment. Motional’s next-generation robotaxi, based on the all-electric Hyundai IONIQ 5, will be available on the Lyft app in Las Vegas starting in 2023. The deployment is part of a landmark partnership between Motional and Lyft announced last year.

The service will be the first time fully driverless cars in a ride-hailing service are available to the public in Nevada. The company, which launched the service under Aptiv in 2018 and now operates as Motional, has run a public self-driving service with Lyft in Las Vegas for over three years. The 2023 deployment represents a significant expansion: it will use Motional’s next-generation robotaxi and will be the first time passengers experience a fully driverless Lyft and Motional ride. With plans to launch in multiple markets, the service is also designed to be scalable and positions both Motional and Lyft to introduce millions of riders to driverless technology in the future.

Motional’s president and CEO Karl Iagnemma said the announcement marks an important milestone between the two companies. “Motional and Lyft pioneered collaboration between the ride-hail and driverless industries, and are now laying the foundation for large-scale deployments of driverless robotaxis,” said Iagnemma. “We look forward to beginning this next chapter in Las Vegas, and then quickly scaling to other markets across the Lyft network.”

“Lyft’s powerful network is the ideal platform for deploying autonomous vehicles at scale. Motional’s driverless tech, combined with Lyft’s marketplace engine, brings us firmly into the self-driving future,” said Lyft Co-Founder and CEO Logan Green. “We can’t wait for riders in Las Vegas to be the first to summon fully driverless cars on the Lyft platform.”

The journey to a 2023 launch

Motional and Lyft are collaborating closely in the lead-up to the 2023 commercial launch, iterating on all aspects of the deployment and rider experience (UX). This includes transporting passengers in Motional’s next-generation IONIQ 5 robotaxi starting in the second half of 2022. During this initial phase, riders will experience their end-to-end journey autonomously, including entering the vehicle, starting their ride, and requesting support. This will allow Motional and Lyft to gather rider feedback, refine the UX to be as intuitive as possible, and more.

The IONIQ 5 robotaxi was developed through strategic collaboration between Motional and Hyundai, is built to safely operate without a driver, and comes with enhanced safety and reliability features. It will be Motional’s second vehicle to operate without a driver on public roads, making the company one of very few industry leaders operating a second-gen driverless platform.

Motional and Lyft continue legacy in Las Vegas

Integrating driverless vehicles onto a ride-hailing network and effectively deploying them is an inherently complex process that requires deep collaboration between the robotaxi fleet provider (Motional) and the network (Lyft). As operators of the world’s most established public self-driving service (Las Vegas; 2018-present), Motional and Lyft have over three years of experience optimizing this service, which represents a unique advantage in the driverless industry.

Las Vegas presents an exciting opportunity as the first city in Motional and Lyft’s multimarket driverless deployment. The dynamic city receives over 40 million visitors annually and is home to some of the world’s largest entertainment, business, and trade events. Motional’s robotaxis will provide safe and reliable transportation to these millions of visitors and Las Vegas community members.

The announcement comes as a new study reaffirms Americans’ interest in experiencing driverless technology. Over a third of respondents believe they will have taken a ride in a driverless vehicle within five years, and more than half think AVs could have a positive impact on their community. Motional and Lyft believe multimodal ride-hail networks can provide improved and safer transportation options for millions of Americans, and look forward to making that future a reality.

Luminar Tech Part of NVIDIA DRIVE Hyperion

Luminar Technologies, Inc., the global leader in automotive lidar hardware and software technology, announced at the NVIDIA GTC conference that its lidar solution has been selected to be part of the sensor suite in the NVIDIA DRIVE Hyperion autonomous vehicle reference platform. This AI vehicle computing platform accelerates development of autonomous consumer vehicles with planned production starting in 2024.

By offering automakers a qualified, complete sensor suite featuring Luminar’s lidar solution, on top of NVIDIA’s centralized high-performance compute and AI software, DRIVE Hyperion provides everything needed to develop production autonomous vehicles.

DRIVE Hyperion will utilize one forward-facing long-range Luminar Iris lidar in its Level 3 highway driving configuration. Iris’ custom lidar architecture is designed to meet the most stringent performance, safety and automotive-grade requirements to enable next-generation safety as well as assisted and autonomous driving on production vehicles.

Innoviz Part of NVIDIA DRIVE

Innoviz Technologies (Nasdaq: INVZ), a leading provider of high-performance, solid-state LiDAR sensors and perception software, announced its advanced perception solution is now supported on the NVIDIA DRIVE platform.

NVIDIA DRIVE is an open, scalable, software-defined, end-to-end AI platform for the transportation industry to build upon.

Torc Robotics and Applied Intuition Partner

Torc Robotics, an independent subsidiary of Daimler Truck AG and a leader in Level 4 self-driving vehicle software for heavy-duty vehicles, and Applied Intuition, a simulation and software tools provider for autonomous vehicle development, announced a multi-year strategic collaboration to address the challenges of autonomous vehicle development in order to safely commercialize Torc’s autonomous trucks for over-the-road applications.

Torc is developing a Level 4 autonomous system for long-haul trucking in the US. Under Level 4 autonomy, a vehicle is capable of performing driving functions under specified operating conditions without human intervention.

Applied Intuition offers simulation and software solutions that enable safe, cost-effective, and scalable approaches to the development of autonomous systems. Its deterministic, high-fidelity, and physics-based simulation software supports virtual testing of Torc’s algorithms in US highway environments.

“At Torc, safety dictates every aspect of our development including how we test and validate our autonomous technology,” said Michael Fleming, CEO and Founder of Torc. “The Applied team has demonstrated their expertise and has equipped us with tools to accelerate the safe development of commercial trucks in a financially viable way. We’re excited to continue our collaboration with Applied to make our roads safer for society.”

The engagement between the two companies started in early 2020 when the Covid-19 pandemic started. On-road testing, a critical part of autonomous vehicle system development, was temporarily halted to ensure the health and safety of Torc’s staff. However, the team was able to continue advancing autonomous driving capabilities in simulation without disruption. Today, Torc’s test trucks run daily routes on public roads in multiple states, and simulation continues to be a vital validation method for new autonomous software features before real-life testing.

The continuing collaboration will help Torc’s autonomous vehicles prepare for unpredictable, potentially dangerous events in the real-world and will enable Torc as it scales as a global organization. Developing autonomous vehicles is a complex engineering challenge, and algorithms require comprehensive training, testing, and validation. Applied Intuition offers technology that supports multiple types of simulation and development infrastructure.

“Simulation allows our team to test new features and capabilities of the autonomous system on hundreds or thousands of different scenarios in a virtual world,” said Ben Hastings, CTO of Torc. “This means by the time the autonomous truck is on public roads, the autonomous system has already been validated in many of the scenarios we could encounter. Above that, strong simulation capabilities are a pillar on our path to commercialization. With Applied Intuition, we are developing for the long-term with a product in mind.”

Serve Robotics Partners with Uber for On-Demand Robotic Delivery for Uber Eats

Serve Robotics, the leading autonomous sidewalk delivery company, announced a partnership with Uber Technologies, Inc. (NYSE: UBER), the world’s largest food delivery and ridesharing platform. The on-demand robotic delivery service will be available to Uber Eats customers starting in Los Angeles early next year.

Serve Robotics is shaping the future of sustainable, self-driving delivery. Founded in 2017 as the robotics division of Postmates, Serve is now an independent company on a mission to make delivery more affordable, sustainable and accessible for everyone. Guided by its proprietary autonomous technology, the company’s self-driving robots have successfully completed tens of thousands of contactless deliveries in major U.S. cities. Serve is backed by Uber, alongside other leading investors.

“Serve Robotics is looking forward to delivering great convenience for Uber Eats merchants and customers,” said Dr. Ali Kashani, co-founder and CEO of Serve Robotics. “Uber is our first commercial partner and will be a strong source of demand for us as we use contactless delivery to power community commerce at scale.”

Gatik and Walmart Operate Driverless Daily in Arkansas

Gatik and Walmart Inc. (NYSE: WMT) announced that Gatik is operating daily without a safety driver behind the wheel on its delivery route for Walmart in Bentonville, Arkansas, moving customer orders between a Walmart dark store and a Neighborhood Market in its fleet of multi-temperature autonomous box trucks.

Gatik’s deployment with Walmart in the state represents the first time that an autonomous trucking company has removed the safety driver from a commercial delivery route on the middle mile anywhere in the world.

Gatik’s fully driverless operations, which began in August 2021, involve consistent, repeated delivery runs multiple times per day, seven days per week on public roads and unlock the full advantages of autonomous delivery for Walmart’s customers: increased speed and responsiveness when fulfilling e-commerce orders, increased asset utilization and enhanced safety for all road users.

“Through our work with Gatik, we’ve identified that autonomous box trucks offer an efficient, safe and sustainable solution for transporting goods on repeatable routes between our stores,” said Tom Ward, senior vice president of last mile at Walmart U.S. “We’re thrilled to be working with Gatik to achieve this industry-first, driverless milestone in our home state of Arkansas and look forward to continuing to use this technology to serve Walmart customers with speed.”

“Arkansas and Gatik have shifted into the future with Gatik’s self-driving delivery truck,” said Arkansas Governor Asa Hutchinson. “It is fitting that Arkansas, which is home to the greatest retail companies in the world, is the launching pad for this innovation in retail delivery.”

“This milestone signifies a revolutionary breakthrough for the autonomous trucking industry,” said Gautam Narang, CEO and co-founder, Gatik. “Our deployment in Bentonville is not a one-time demonstration. These are frequent, revenue-generating, daily runs that our trucks are completing safely in a range of conditions on public roads, demonstrating the commercial and technical advantages of fully driverless operations on the middle mile. We’re thrilled to enable Walmart’s customers to reap the benefits.”

In December 2020, Gatik and Walmart received the Arkansas State Highway Commission’s first ever approval to remove the safety driver from Gatik’s autonomous trucks, following the completion of 18 months’ successful operations. As part of its roadmap to operating fully driverless, Gatik undertook a comprehensive stakeholder engagement strategy, involving state and local leadership and emergency services, and will continue to hold ongoing informational workshops concerning its ground-breaking autonomous operations.

Since commencing commercial operations in 2019, Gatik has achieved a 100 percent safety record across multiple operational sites in North America (including Arkansas, Texas, Louisiana and Ontario). Gatik focuses exclusively on fixed, repeatable delivery routes to maximize safety, using proprietary, commercial-grade autonomous technology that is purpose-built for B2B short-haul logistics. By constraining the operational design domain, Gatik has been able to safely remove the safety driver much more quickly than in other applications, such as passenger transportation or B2C delivery. The complex urban route in Bentonville involves safely navigating intersections, traffic lights and merging on dense urban roads.

As retailers turn increasingly to hub-and-spoke distribution models to meet consumer needs, the middle mile has emerged as a critical component of the supply chain. Over the past decade, shorter, urban routes have become more prominent: 65 percent of all routes are now under 500 miles, and routes under 100 miles have grown by 37 percent. Gatik’s autonomous solution helps increase efficiencies and meet consumer needs by delivering on the promise of autonomy today.

Daimler Trucks & Platform Science Launch Virtual Vehicle

Daimler Trucks North America (DTNA), in collaboration with Platform Science, a leading connected vehicle platform, announced the launch of Virtual Vehicle™, the first open OEM platform that enables fleets to access telematics, software solutions, real-time vehicle data, and third-party applications directly from their vehicles. In addition, the platform provides the tools to manage those applications, connectivity, and the mobile devices drivers need to use them. Virtual Vehicle represents a platform-first approach that provides customers greater value and a significantly expanded choice of software-enabled services.

GreenPower Motor Company & Perrone Robotics Deliver Next Gen AV Star

GreenPower Motor Company Inc., a leading manufacturer and distributor of zero-emission electric-powered vehicles, together with Perrone Robotics, a leading provider of fully autonomous vehicle (AV) technology and turnkey vehicle solutions for the mobility of people and things, announced their Original Equipment Manufacturer (OEM) agreement. Under this new agreement, the two companies will continue to build on the success of the original AV Star developed for the Jacksonville Transportation Authority (JTA) in 2019. The original AV Star was developed to meet a growing demand in the transit and transportation sector, where reliable mobility was a requirement to expand accessibility options for all end-users. Once deployed in 2019, the JTA AV Star became the nation’s first fully autonomous, all-electric, ADA-compliant, and FMVSS-certified vehicle. The base EV Star vehicle is also Altoona tested and Buy America compliant.

Today, building on the success of this project, GreenPower and Perrone Robotics have come together to deliver the next generation of the AV Star. Through this OEM agreement, GreenPower’s EV Star, a multi-purpose, zero-emission mini e-bus, will be upfitted with Perrone’s TONY® (short for “To Navigate You”) autonomous vehicle retrofit kit, transforming the EV Star into the AV Star. Built on the same highly rated chassis as the EV Star and combined with Perrone’s reliable autonomous vehicle platform, the AV Star is one of the most transit-ready autonomy-capable vehicles on the market.

“Through Perrone’s leading autonomy solution combined with our purpose-built, all -electric vehicles, we are able to provide the industry with reliable, innovative solutions together. This is the most compelling autonomous transit solution out there, and we are honored to partner with the true industry leader,” said Fraser Atkinson, CEO of GreenPower. “This vehicle is going to be a market leading gamechanger, and we are looking forward to offering market-proven solutions at the national level.

“This is an important milestone not only for Perrone and GreenPower, but the industry as a whole,” continued Atkinson. “As we kick off the first day at the American Public Transit Association (APTA) Conference and Expo, we’re excited to showcase this breakthrough new platform to thousands of industry stakeholders at this prestigious event.”

“This new OEM agreement is an important step in delivering large capacity road-worthy AV mobility options for the transit and transportation industry,” stated Paul Perrone, Founder and CEO of Perrone Robotics, Inc. “Our ability to meet evolving customer needs depends on our ability to work with partners like GreenPower with vehicle solutions that fill specific customer demands in the transit space. This is a critical step in the scale of autonomous vehicle solutions across the transit and transportation value chain. Cost-effective, zero-emissions, reliable autonomy that can extend end-user accessibility is the future of mobility. This relationship with GreenPower continues to validate why Perrone Robotics is the autonomy solution of choice, for vehicle manufacturers, customers, and AV passengers alike.”

The AV Star aims to be a leading mobility and transportation choice for cities, municipalities, and public and private campuses. The vehicle carries up to 16 passengers, is ADA compliant and FMVSS compliant, and can travel at highway speeds. These critical features are driven by the vision for the future of autonomous transit.

The AV Star is currently being proven out in the industry through the companies’ work with the JTA and U2C, where it is undergoing testing for transit applications.

Plus Selects Aeva for Sensing

Plus (formerly Plus.ai), a global provider of self-driving truck technology, has selected Aeva (NYSE: AEVA), a leader in next-generation sensing and perception systems, to supply automotive-grade long-range 4D LiDAR for the production of driver-in and fully autonomous trucks powered by the PlusDrive system. Aeva’s high-performance LiDAR will help Plus autonomous trucks sense their environment clearly at long ranges, shorten response time in safety-critical situations, and address edge cases leveraging Aeva’s proprietary instant velocity data.

Aeva and Plus have been collaborating since 2019 to equip and validate Plus’s autonomous trucking system with Aeva Frequency Modulated Continuous Wave (FMCW) 4D LiDAR. With the agreement, Plus will use Aeva’s 4D LiDAR sensors to augment its long-range perception in Plus’s commercially available driver-in product starting in late 2022 and leading to its fully autonomous driving system.

Plus is partnered with some of the world’s largest truck makers and freight carriers. Production and delivery of its driver-in autonomous trucking solution, PlusDrive, to customers started in early 2021, with plans for more than 100,000 vehicles to be in service by the end of 2025. The PlusDrive solution is designed for fleets that are looking to improve driver retention, enhance safety, reduce fuel costs, and lower their carbon emissions.

“Our global deployment of automated trucks to fleets commercially at scale requires leading technology that is automotive grade, high performance, and practical. We selected Aeva as our production partner because its 4D LiDAR complements Plus’s state-of-the-art long range perception by adding important instant velocity detection for the safe operation of autonomous trucks, and Aeva shares our commitment to bring autonomous trucks to market,” said Shawn Kerrigan, COO and Co-founder of Plus.

Heavy trucks take much longer to stop than passenger cars. Therefore, an automated trucking sensor system needs to detect objects, place them in lanes, and assign an accurate velocity at very long ranges. Aeva’s 4D LiDAR senses precise velocity and position for each point, even at distances over 500 meters away. The combination of Aeva’s 4D LiDAR and Plus’s proprietary autonomy stack addresses many edge cases, such as previously unseen obstacles that may confound deep neural networks in the perception stack.
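A back-of-envelope calculation shows why such detection ranges matter. The sketch below uses illustrative numbers only (assumed speeds, reaction time, and deceleration rates, not Plus or Aeva figures) to estimate total stopping distance, comparing a loaded truck's gentler braking to a passenger car's:

```python
# Illustrative stopping-distance estimate: reaction distance plus braking
# distance. All numbers are assumptions for the example, not published specs.

def stopping_distance_m(speed_kmh: float, decel_mps2: float,
                        reaction_s: float = 1.5) -> float:
    """Distance covered during reaction time plus braking to a full stop."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

# A loaded heavy truck decelerates far more gently than a passenger car.
truck = stopping_distance_m(100, decel_mps2=3.0)  # assumed truck braking
car = stopping_distance_m(100, decel_mps2=7.0)    # assumed car braking

print(f"truck: {truck:.0f} m, car: {car:.0f} m")
# prints: truck: 170 m, car: 97 m
```

Under these assumptions, a truck at 100 km/h needs roughly 170 m to stop versus about 97 m for a car, so a sensor that can detect and assign velocity to objects several hundred meters out gives the planner meaningful margin.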

“We are pleased that a leader like Plus recognizes the unparalleled performance of our unique 4D LiDAR sensors,” said Soroush Salehian, Co-founder and CEO at Aeva. “This production partnership validates Aeva’s technology as the world’s first automotive 4D LiDAR for autonomous trucking and we look forward to supporting Plus as they ramp up production of their automated trucks.”

Spark Connected Partners with AutoX for Wireless Sensors

Spark Connected (www.sparkconnected.com), a global leader in developing advanced and innovative wireless power technology, announced a partnership with AutoX (www.autox.ai), a global leader in autonomous driving, to bring wirelessly powered sensors to Level 4 autonomous vehicles.

Spark Connected has continued to innovate and expand its automotive-grade wireless power solutions portfolio well beyond standard in-cabin smartphone charging. The first-generation Beast 1.0 has already been integrated into model year 2021 vehicles, and Beast 2.0 was recently introduced with Qi 1.3 capability. The new higher-power, automotive-grade Phoenix is the latest addition to the Spark automotive portfolio, wirelessly powering automotive sensors in self-driving vehicles. These sensors allow AutoX vehicles to perceive and understand the world around them, a critical function in driverless vehicles.

According to Ruwanga Dassanayake, COO at Spark Connected, “Spark Connected has always focused on wireless power technology that has a transformative impact on how we interact with devices as a society. Our deep domain expertise helps solve the most complex and challenging technical problems for our customers. Partnering with AutoX, which operates the largest self-driving taxi fleet in Asia, has been an absolute pleasure. Their groundbreaking driverless technology platform, combined with Spark’s unique and advanced wireless power technology, allows AutoX to accelerate self-driving vehicles to market.”

AutoX is leading the RoboTaxi industry in China and abroad: its self-driving platform can handle the most challenging and dynamic traffic scenarios in urban cities around the world. AutoX is the first and only company in China operating a fully driverless RoboTaxi service on public roads. In addition, AutoX has obtained the world’s second fully driverless permit from the California DMV.

“Spark Connected’s innovative and unique wireless power technology is key for us because it allows for a robust and durable power source with no mechanical connections. The automotive grade Phoenix solution is in line with our high safety standards, while allowing us the flexibility of mechanical design and the reliability to deliver worry-free power to our vehicle sensors,” said Dr. Jewel Li, COO at AutoX.

Highlights of the AutoX platform:

  • Market leadership. AutoX is the first and only company in China operating a fully driverless RoboTaxi service on public roads.
  • Superhuman safety. AutoX is committed to building a service with safety as the highest priority. Our integrated approach to AI algorithms, hardware, and large-scale simulation enables us to achieve a safety level beyond human drivers.
  • Universal impact. AutoX’s innovative AI travels beyond the lab because it is built for real-world complex traffic scenarios. Our mission is to bring autonomous driving to everyone’s daily life and make it universally accessible.

About Spark Connected:

Spark Connected | powering the world, wirelessly™

Spark Connected is a global leader in wireless power technology. The company has the broadest portfolio of innovative ready-to-use wireless power solutions ranging from 1 Watt to 2.4 kilowatts.

The company’s patented hardware reference designs, combined with the highly scalable Pantheon™ software platform, allows end-to-end intelligent and adaptive power system control. Spark offers both inductive and resonant technologies. The result is best in class performance, efficiency, safety, thermal management, and EMI.

This proven technology has been successfully integrated into a myriad of customer products in a wide variety of applications, including automotive, industrial, consumer, medical, IoT, security and infrastructure.

Spark Connected is a full member of and has multiple leadership positions with the global Wireless Power Consortium, driving and influencing the global standards and specifications.

QCraft Releases 3rd Gen Driven-by-QCraft with NVIDIA

Recently, QCraft released the third-generation hardware for its “Driven-by-QCraft” solution. The compact sensor suite combines multiple types of advanced high-precision sensors to achieve 360-degree, blind-spot-free perception with solid stability and real-time performance. As a safety guarantee, every module, including the sensors, computing platform, power system and communication system, is designed with full redundancy.

Most importantly, NVIDIA DRIVE Orin™ system-on-a-chip (SoC) will be adopted by QCraft to boost the next-generation of the Driven-by-QCraft solution. Armed with NVIDIA Orin, QCraft expects to accelerate the progress of its L4 autonomous driving solution to achieve automotive-grade and commercialization.

A hardware solution designed to support multi-vehicle, multi-scenario and multi-city road operations

QCraft currently operates a fleet of around 100 autonomous driving vehicles, all powered by the same hardware solution. To date, the fleet has reached ten cities globally, including Silicon Valley in the U.S. and Beijing, Shenzhen, Suzhou and other major cities in China. This hardware solution has been powering ten types of vehicles and is capable of coping with all sorts of scenarios—urban congestion, storms and tunnels, to name a few—with high sophistication.

Longzhou ONE, QCraft’s first robobus for public roads and the pioneer of its kind in China, has operated in six cities, including Suzhou, Shenzhen, Beijing, Wuhan, Wuxi and Chongqing, comprising the largest fleet of its kind in China.

In 2020, QCraft launched China’s first 5G robobus project to achieve regular operation, in Suzhou. One year later, the company is expanding its footprint with the launch of a ride-hailing 5G robobus project in the downtown area of Wuxi. By deploying robobuses on three bus routes totaling 15 kilometers (9.3 miles), QCraft’s initial service connects major shopping centers and subway stations to residential communities within a region of about 10 square kilometers in the busiest area of the city.

“Driven-by-QCraft”, the company’s autonomous driving solution, underpins this rapid rollout. The solution has two modules: onboard software and onboard hardware.

The hardware solution was developed in tandem with the software to support its application. QCraft owns the full tech stack for onboard software, including perception, mapping & localization, route planning & decision making, and control. The seamless compatibility of hardware and software enables quick deployment and wide application across different urban scenarios and vehicle models.

The widespread adoption of Longzhou vehicles continuously generates massive amounts of data, so the capacity to collect and use that data autonomously and efficiently plays a crucial role in the rapid advancement of autonomous driving technology. The QCraft Dataflow Platform accelerates autonomous driving development by automating large-scale data collection, cleaning and labeling, and by facilitating a data-driven, simulation-based verification and evaluation process that spans all development stages.

Taking safety as the red line

Multi-sensor fusion: 360-degree blindspot-free perception

To perceive traffic participants more reliably, QCraft adopts a multi-sensor fusion approach, constructing a sensor system that achieves 360-degree perception without blind spots. Thanks to its modular design, this multi-sensor fusion suite can be easily deployed and upgraded; it includes two long-range measurement lidars (main lidars), three short-range blind-spot-filling lidars (blind-spot area lidars), four millimeter-wave radars, nine cameras and one IMU set.

  • 360-degree blind-spot-free perception: Traditional sensor solutions are prone to blind spots, which can be dangerous for high-speed cars and large buses. QCraft has launched the first 360-degree blind-spot-free sensor solution in China, solving the problem of dead zones around the vehicle for the first time. Sensors are redundant to each other, covering even the area less than 10 cm from the car.
  • Left-right mutual redundancy of the sensor suite: Unlike a cell phone, TV or computer, the malfunction of an autonomous car can cause fatal accidents. With this in mind, we have built “multi-insurance” to make the sensor suite redundant. Based on three groups of sensors, even if one or two of them fail, the autonomous driving system can still ensure the normal operation of the perception module and will allow the vehicle to stop safely.
  • High-synchronization lidar solution: The sensor suites are installed in three groups. The lidars of each group always rotate simultaneously in the same direction, creating a very high degree of synchronization. This avoids dislocation and ghosting of the point cloud when there are dynamic objects in the vicinity, and ensures that all point cloud data is collected and processed at the same time, maximizing the use of all information.
  • Camera’s intelligent environment adaptation: Through advanced software algorithms, the system can deal with either overexposure or underexposure under different light conditions and can solve the problem of smearing caused by motion blur while driving. The camera specially designed to identify traffic lights can accurately identify the shape and color of traffic lights 150 meters away at night. In addition, the high-resolution camera meets vehicle manufacturing standards for extreme environments and can operate in temperatures ranging from -40 to 125°C.
  • Minimization of camera blind spots: Using seven surround-view 5-megapixel cameras, QCraft expands the vertical perception range by mounting the cameras at a 90-degree angle, which reduces camera blind spots by more than 90%. The cameras can distinguish small objects at close range, such as traffic cones and children. The design also keeps the cameras’ line-by-line exposure direction consistent with the scanning direction of the lidars, improving the quality of camera-lidar fusion.
  • Self-cleaning sensors: Outdoor temperature fluctuations and rainy weather can condense water on the camera lens, blurring the image. QCraft’s sensors have a self-cleaning function that automatically removes water mist, dust, and other dirt.
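The “multi-insurance” idea above — perception keeps running while at least one of the three sensor groups is healthy, and only a total loss triggers a safe stop — can be sketched as follows. The function names and logic are hypothetical illustrations, not QCraft’s implementation.

```python
# Hypothetical sketch of three-way sensor-group redundancy: perception stays
# available while any one of the three groups is healthy; losing all three
# forces the vehicle into a safe-stop maneuver.
def perception_available(group_ok: list) -> bool:
    """True if at least one sensor group is still healthy."""
    return any(group_ok)

def vehicle_action(group_ok: list) -> str:
    return "normal_operation" if perception_available(group_ok) else "safe_stop"

# Two of three groups down: the system still operates.
print(vehicle_action([True, False, False]))   # normal_operation
# All groups down: stop safely.
print(vehicle_action([False, False, False]))  # safe_stop
```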

Computing platform: Three levels of mutual redundancy design

  • The computing platform of the Driven-by-QCraft solution includes a central computing unit, a backup computing unit and an on-board computing unit. Under normal circumstances, the central computing unit runs the autonomous driving software. If it fails for any reason, the backup computing unit immediately takes over vehicle control and determines its movement. This redundancy design allows the vehicle to pull over to the side of the road or brake during an emergency.
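The failover behavior described above — the central unit drives under normal conditions, and the backup unit takes over to execute a minimal-risk maneuver on failure — can be sketched as below. The names and states are hypothetical, not QCraft’s actual software.

```python
# Hypothetical sketch of compute failover: the central unit normally controls
# the vehicle; on a central-unit fault, the backup unit takes over and
# commands a minimal-risk maneuver (pull over or brake).
def active_controller(central_ok: bool) -> str:
    return "central" if central_ok else "backup"

def planned_maneuver(central_ok: bool) -> str:
    # The backup unit only executes a safe stop, not full autonomous driving.
    return "normal_driving" if central_ok else "pull_over_or_brake"

print(active_controller(True), planned_maneuver(True))    # central normal_driving
print(active_controller(False), planned_maneuver(False))  # backup pull_over_or_brake
```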

Power system: Comprehensive power-path layering and protection design

  • With layered power path management, power can be dynamically allocated according to real-time weather and road conditions, giving priority to supporting the core function modules and regulating the power supply of auxiliary function modules. This kind of management can help to:
    • extend the vehicle’s operating mileage;
    • effectively identify and isolate abnormal fault units within the system, avoid fault cascades, protect core functional modules from being affected by random failures, and reduce operation and maintenance costs;
  • Combined with a redundant power supply and the sensor suite, the power system maintains the minimum subsystems that keep the vehicle safe to drive, even when one or more core functional modules are accidentally damaged.
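The layered power-path management described above — core modules supplied first, auxiliary modules regulated, and abnormal units isolated so faults do not cascade — can be sketched as a simple budgeted allocation. Module names and wattages are hypothetical illustrations.

```python
# Illustrative sketch of layered power-path management: under a fixed power
# budget, core modules are supplied first, auxiliary modules get what remains,
# and a module flagged as faulty is isolated (allocated zero) so a failure
# cannot cascade into the rest of the system.
def allocate_power(budget_w, core, aux, faulty):
    plan = {}
    for tier in (core, aux):                 # core tier first, then auxiliary
        for name, need_w in tier.items():
            if name in faulty:               # isolate abnormal fault units
                plan[name] = 0.0
                continue
            grant = min(need_w, budget_w)
            plan[name] = grant
            budget_w -= grant
    return plan

plan = allocate_power(
    100.0,
    core={"compute": 60.0, "sensors": 30.0},   # hypothetical core modules
    aux={"cabin_display": 25.0},               # hypothetical auxiliary module
    faulty={"cabin_display"},                  # isolated: draws no power
)
print(plan)
```

Core modules are fully powered, while the faulty auxiliary module is cut off rather than being allowed to drain the remaining budget.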

Built for scaling up

Multi-sensor fusion suite: Suitable for different models, and cost-controllable

  • The hardware solution introduced by QCraft supports different vehicle models and is the first hardware solution that can be used for both robotaxis and robobuses. A single hardware solution can be applied across different vehicle models and different cities, helping to integrate data from the whole fleet and making it far easier to close the data loop. With large amounts of data from this universal hardware, QCraft can accelerate software iteration and continuously deliver over-the-air (OTA) upgrades to all vehicles in the fleet.
  • Furthermore, the hardware solution can be configured according to the needs of diverse scenarios. QCraft forecasts that the cost of a sensor suite will drop below 100,000 yuan within the next two to three years.

Computing platform: Aiming to achieve auto grade

  • QCraft has also announced the adoption of NVIDIA’s Orin chip in its latest-generation hardware.
  • Among current automotive AI chips, NVIDIA’s Orin X is at the top of the pyramid, having been called “the strongest AI chip for intelligent driving on earth.” Its computing power allows it to perform massive concurrent operations and to support the complex deep neural networks that process data generated by the autonomous driving system to make decisions.
  • In addition, NVIDIA’s Orin X chip is automotive grade and complies with the ASIL-D level of ISO 26262 at the system level, which is critical for applications with strict safety requirements.
  • Computing platforms based on Orin chips achieve an energy efficiency of 2 to 3 TOPS per watt. This efficiency helps enable the large-scale deployment of L4 autonomous vehicles.
  • With the gradual commercialization of advanced driver assistance systems, the affordable price of NVIDIA’s Orin X chip brings great benefits by reducing the overall cost of the entire autonomous driving solution.
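To put the 2–3 TOPS/W efficiency figure in perspective, a back-of-the-envelope calculation converts it into platform power draw. The 254 TOPS rating below is NVIDIA’s published figure for DRIVE Orin; the resulting wattage is an illustrative estimate, not a product specification.

```python
# Back-of-the-envelope power estimate from an efficiency figure:
# power (W) = compute (TOPS) / efficiency (TOPS per watt).
def platform_power_watts(tops: float, tops_per_watt: float) -> float:
    """Power needed to sustain `tops` of compute at a given efficiency."""
    return tops / tops_per_watt

# NVIDIA rates DRIVE Orin at 254 TOPS; at 2-3 TOPS/W the platform would
# draw roughly 85-127 W of power.
for efficiency in (2.0, 3.0):
    watts = platform_power_watts(254, efficiency)
    print(f"254 TOPS at {efficiency} TOPS/W -> {watts:.0f} W")
```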

The custom-on-demand “Driven-by-QCraft” solution targets efficient deployment of autonomous vehicles across a variety of settings. The Longzhou autonomous vehicles rolled out by QCraft can be applied to many scenarios, including online ride-hailing, buses and shuttle cars.

As intelligent, connected vehicles evolve away from purely private ownership toward shared mobility, robotaxis are moving toward new formats that offer larger space and greener mobility services. To create a more forward-looking robotaxi, QCraft started with the robobus, exploring new forms better suited to shared mobility.

Moreover, QCraft also provides customers with a toolchain for autonomous driving research and development. It supports clients’ data-driven algorithm development, letting them quickly improve their autonomous driving systems through their own closed data loops.

Looking to the future, QCraft plans to apply “Driven-by-QCraft” to more smart transport scenarios, along with more Longzhou models and sustainable tech advancements, aiming to finally make autonomous driving a reality.

StradVision to Demo at Automotive Tech Week in Novi

StradVision, a leader in computer vision technology for autonomous vehicles and ADAS systems, is demonstrating its latest technologies on-site Nov. 16-17 at Automotive Tech Week 2021, held at the Suburban Collection Showplace, 46100 Grand River Avenue in Novi, Michigan.

  • Depth-map Solution: The latest feature, implementing innovative Pseudo-LiDAR technology that replaces high-cost, high-performance LiDAR equipment and offers high-precision distance measurement to objects using only a mono-channel camera
  • Semantic Segmentation: A technology that classifies objects by analyzing images acquired through the vehicle’s camera on a pixel-by-pixel basis using deep learning
  • Multi-camera 360-degree perception: This technology uses up to nine cameras and is critical for implementing autonomous driving features of L3 or above, such as Automated Valet Parking (AVP) and Enhanced Autopilot
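The semantic-segmentation item above describes pixel-by-pixel classification of camera images. A toy version of that output stage can be sketched as below — a network emits per-class scores for every pixel, and the label map is the per-pixel argmax. This is a generic illustration, not StradVision’s SVNet; the image size and class names are invented.

```python
import numpy as np

# Toy illustration of pixel-wise semantic segmentation: a network produces a
# score per class for every pixel, and the predicted label map is the argmax
# over the class axis. Sizes and class names are hypothetical.
H, W, NUM_CLASSES = 4, 6, 3
CLASS_NAMES = ["road", "vehicle", "pedestrian"]

rng = np.random.default_rng(0)
logits = rng.standard_normal((H, W, NUM_CLASSES))  # stand-in for CNN output

label_map = logits.argmax(axis=-1)  # one class index per pixel
print(label_map.shape)              # (4, 6): a label for every pixel
print(CLASS_NAMES[label_map[0, 0]]) # class name of the top-left pixel
```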

“I look forward to our time at Automotive Tech Week 2021, and showcasing the innovative technologies StradVision is unveiling at the event,” said Sunny Lee, StradVision’s Chief Operating Officer, who is leading the company’s team at the event. “As we continue to ramp up our presence in Michigan, the heart of the U.S. auto industry, events like this let us get our product in front of key industry leaders who are on board with our mission of using technology to achieve the safest possible experiences with ADAS systems and autonomous driving.”

StradVision’s SVNet is lightweight software that allows vehicles to accurately detect and identify objects such as other vehicles, lanes, pedestrians, animals, free space, traffic signs, and lights, even in harsh weather conditions or poor lighting.

Toposens Partners with Infineon

Toposens has partnered with Infineon Technologies AG (FSE: IFX / OTCQX: IFNNY) to realize 3D obstacle detection and collision avoidance in autonomous systems using Toposens’ proprietary 3D ultrasound technology. The Munich-based sensor manufacturer offers the ECHO ONE DK 3D ultrasonic sensor, which leverages sound, machine vision, and advanced algorithms to enable robust, cost-effective and accurate 3D vision for applications such as robotics, autonomous driving and consumer electronics.

The easy-to-integrate 3D ultrasonic sensor enables safe collision avoidance through precise 3D obstacle detection. It is based on Infineon’s XENSIV™ MEMS microphone IM73A135V01. This next-generation reference product allows customers to reduce their development efforts and time-to-market. In addition, it is low cost and energy efficient compared to existing industrial 3D sensors. The new technology is ideal for improving the performance of automated guided vehicles (AGVs).

“Our XENSIV MEMS microphones enable the detection of sound pulses, so they are a critical component for 3D object localization via ultrasound,” said Dr. Roland Helm, Vice President and Head of Sensor Product Line from Infineon. “They offer a combination of exceptionally low noise and the highest SNR (signal-to-noise ratio) in the industry, resulting in improved reliability of the 3D data. This allows the detection of even the faintest ultrasonic echoes from distant, complex and small objects.”

“Making use of Infineon’s MEMS microphone, we were able to realize our new ultrasonic 3D sensor with a high overall sensitivity in the ultrasonic frequency spectrum, giving us the best range and widest opening angle,” said Tobias Bahnemann, CEO and co-founder of Toposens. “This enables our AGVs, robots or other applications to avoid collisions with all kinds of obstacles, even in the harshest environments, as proven by the IP57 protection rating.”

Lighting conditions, reflections and weather typically degrade the performance of existing sensor technologies. Toposens sensors, however, rely on echolocation to generate real-time 3D point clouds, guiding autonomous systems in even the most challenging conditions and allowing consumer electronics to recognize their surroundings. The ultrasonic echolocation sensor enables 3D multi-object detection, which is critical for collision avoidance, with low calibration effort and high reliability and robustness. In addition, ultrasonic sensing reduces the high number of false positives and false negatives that can occur with optical sensors and degrade system efficiency.
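The echolocation principle behind such sensors is time-of-flight ranging: the sensor emits an ultrasonic pulse, times the echo, and range is half the round-trip distance. The sketch below shows that arithmetic; the values are illustrative, not Toposens specifications.

```python
# Back-of-the-envelope ultrasonic ranging: distance to an obstacle is the
# speed of sound times the echo's round-trip time, divided by two.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_range_m(round_trip_s: float) -> float:
    """Distance to an obstacle from the pulse's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 10 ms round trip corresponds to an obstacle roughly 1.7 m away.
print(f"{echo_range_m(0.01):.2f} m")
```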

Availability

Toposens is offering customers the new ECHO ONE DK platform for easier and more flexible evaluation of the sensor in their products. For easy integration, CAN is offered as the standard communication interface; others can be provided on request. Three software packages are available: a C++ library, ROS support and a cross-platform 3D data visualizer. In addition, a separate interface adapter is available for firmware updates.


COMMENT: Let Us Know What You Think