Autonomous and Self-Driving Vehicle News: Venti Technologies, IIHS, Velodyne, Guidehouse, AEye & StradVision

In autonomous and self-driving vehicle news are Venti Technologies, IIHS, Velodyne, Guidehouse, AEye and StradVision.

Venti Tech Success

Venti Technologies, the leader in safe-speed autonomous vehicles, announced that it has successfully deployed two autonomous SUVs built by SAIC-GM-Wuling Automobile (SGMW), the joint venture formed by Chinese automakers SAIC and Wuling, along with GM, achieving another milestone in its path to commercialization.

The two autonomous SUVs were deployed at a school in Nanning City, the capital of Guangxi, China. The SUVs, which operate at a maximum speed of 15 km per hour, provide shuttle transportation services to students and visitors and are easily booked via a hailing app for destinations along a 9-station loop. Venti Technologies’ flexible, algorithm-based autonomous vehicle technology has been installed in the SUVs, which run the 3 km loop in opposite directions. The Company’s sensor configuration eliminates blind spots and allows the vehicles to operate in mixed traffic alongside other road users such as cars, scooters and pedestrians. The Venti-enabled SUVs are also able to overtake slower vehicles while navigating around oncoming traffic.
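
How two shuttles serve one loop in opposite directions can be illustrated with a short dispatch sketch. The station numbering, shuttle IDs and assignment rule below are hypothetical assumptions for illustration, not Venti's actual software; the idea is simply to hand each ride request to whichever vehicle reaches the pickup station in fewer stops.

    # Minimal, hypothetical dispatch sketch (illustrative only, not Venti
    # Technologies' software): pick whichever of two counter-rotating
    # shuttles reaches the pickup station in fewer stops.

    N_STATIONS = 9  # stations 0..8 around the loop

    def stops_until(position: int, target: int, direction: int) -> int:
        """Stations a shuttle must pass to reach `target`; direction is +1 or -1."""
        return (direction * (target - position)) % N_STATIONS

    def assign_shuttle(pickup: int, shuttles: dict[str, tuple[int, int]]) -> str:
        """Choose the shuttle (id -> (current station, direction)) with the fewest stops."""
        return min(shuttles, key=lambda s: stops_until(shuttles[s][0], pickup, shuttles[s][1]))

    # Example: rider hails from station 5; "A" is at station 2 going +1,
    # "B" is at station 7 going -1.
    print(assign_shuttle(5, {"A": (2, +1), "B": (7, -1)}))  # -> "B" (2 stops vs 3)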

IIHS Calls for Regulation

Federal regulators should expand and strengthen proposed safety standards for self-driving vehicles, the Insurance Institute for Highway Safety and Highway Loss Data Institute said in a recent regulatory comment.

The National Highway Traffic Safety Administration (NHTSA) requested comments on proposed changes to the rules for occupant protection. The changes are intended to account for new designs and seating choices that may come with the introduction of fully automated vehicles. But it would be a mistake to issue those standards without including enforceable regulations to govern the driving behavior of such automated systems, IIHS Chief Research Officer David Zuby wrote in the comment.

IIHS research has shown that manufacturers will need to design self-driving vehicles specifically to prioritize safety over other rider preferences to eliminate the majority of today’s crashes (see “Self-driving vehicles could struggle to eliminate most crashes,” June 4, 2020).

Removing some of the hurdles to the introduction of these vehicles without first establishing rules for how they should drive would be putting the cart before the horse, Zuby said.

In the context of occupant protection, regulators should require that automated systems be designed to refuse to start a journey if an airbag is malfunctioning or any passenger is not properly restrained.
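
The kind of interlock IIHS is asking for can be sketched in a few lines. The field names and checks below are hypothetical, not regulatory language; the point is simply that the vehicle refuses to begin a trip unless the airbag system reports healthy and every occupied seat reports a latched belt.

    # Hypothetical pre-trip interlock of the kind IIHS describes; field
    # names are illustrative, not part of any proposed standard.
    from dataclasses import dataclass

    @dataclass
    class SeatStatus:
        occupied: bool
        belt_latched: bool

    def may_start_trip(airbags_ok: bool, seats: list[SeatStatus]) -> bool:
        """Allow the trip only if airbags are healthy and all occupants are belted."""
        return airbags_ok and all(s.belt_latched for s in seats if s.occupied)

    # Example: a rear passenger is unbelted, so the automated system refuses to start.
    cabin = [SeatStatus(True, True), SeatStatus(False, False), SeatStatus(True, False)]
    print(may_start_trip(airbags_ok=True, seats=cabin))  # -> False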

Zuby also argued that the same level of protection currently mandated for the front row should be required for all available seats. When a driver is no longer needed and in some cases vehicles no longer have driver controls, there is no reason to suppose people will sit where they have traditionally. IIHS research shows occupants seated behind the front row are no longer safer than those seated in front, and occupants of some ages are at greater risk in the back (see “Rear-seat occupant protection hasn’t kept pace with the front,” April 25, 2019).

For vehicles without a steering wheel and other traditional controls, the left front seat should be subject to the same requirements that are currently mandated for the front passenger seat. Similarly, in vehicles with bench seats, lap and shoulder belts should be required for the middle seat in all rows.

Guidehouse Insights

A new report from Guidehouse Insights projects the size of global and regional markets for light-duty consumer and commercial vehicles with highly automated driving (AD) capability through 2030.

Although 2020 was long projected as the turning point for when automated vehicles would start being widely deployed and adopted, the reality is that AD technology has proven far more challenging to develop and validate than anticipated. In addition to significant technical challenges associated with AD, macroeconomic factors created by the coronavirus pandemic are expected to further slow progress.

“In the near term, automated driving deployment will remain very limited to specific locales where it has been demonstrated to work safely and reliably and there is a demonstrated market willing to adopt the technology,” says Sam Abuelsamid, principal research analyst with Guidehouse Insights. “With much of the global economy shut down through the first half of 2020, companies are also reevaluating their investment priorities and AD is likely to lose out to electrification.”

According to the report, with consumers having less disposable income for the foreseeable future, vehicle sales and ride-hailing services are also likely to take a significant hit. However, even before the global health crisis, companies in the AD sector were rethinking go-to-market strategy and turning toward goods delivery rather than carrying passengers.

The report, Market Data: Automated Driving Vehicles, provides projections of the size of global and regional markets for light-duty consumer and commercial vehicles with highly automated driving capability. Baseline, conservative, and aggressive scenarios for market deployment are included, as well as market splits among consumer vehicles, robotaxis, and goods delivery vehicles. Forecasts for partially automated vehicles are also included. An executive summary of the report is available for free.

Velodyne Sales Agreement with AGROINTELLI

Velodyne Lidar, Inc. today announced a three-year sales agreement with AGROINTELLI, a developer of intelligent farming solutions. AGROINTELLI uses Velodyne lidar sensors in production of its Robotti autonomous tool carriers, which increase efficiency in the field and help professional farmers save time and money.

Robotti uses Velodyne’s Puck™ sensors for safe and efficient navigation on farmland. It supports a wide range of equipment such as a precision seeder, mechanical inter-row weeder and seed drill. Robotti performs highly accurate, site-specific agricultural operations including seedbed preparation, seeding, weeding, fertilizing, spraying and mowing.

Robotti is suitable for conventional and organic farms as well as for conservation agriculture. Farmers can let the robot work autonomously or control it manually from a tablet using an app. Designed with uncompromising attention to safety, Robotti is built to react quickly and accurately should a risk of accident arise.

AEYE 4Sight Sensors

AEye, Inc., an artificial perception pioneer, today announced that its 4Sight™ M sensor, based on its patented intelligent perception system design, has established a new standard for sensor reliability. In testing completed at NTS, one of the most respected testing, inspection, and certification companies in the US, the 4Sight M scan block surpassed automotive qualification for both shock and vibration. AEye also announced the availability of 4Sight – a new family of advanced 1550nm LiDAR vision systems.

4Sight is the fifth-generation sensor from AEye and is based on AEye’s powerful iDAR™ platform. AEye’s unique patented system design is elegant in its simplicity, with one laser, one MEMS, one receiver, and one SoC. Driven by extensible software, 4Sight is designed from the ground up to identify and deliver salient information while exceeding all industry quality and reliability standards, and it can be manufactured at scale at low cost. To prove its reliability, AEye recently engaged NTS to conduct extensive shock and vibration testing on the 4Sight sensor. The results showed that a 4Sight sensor can sustain a mechanical shock of over 50G, random vibration of over 12Grms (5-2000Hz), and sustained vibration of over 3G.

The size of the mirror in a MEMS largely determines its reliability. Larger mirrors have larger inertia, generating 10x to 600x more torque from shock and vibration events. In addition, larger mirrors do not allow for the fast, quasi-static movement needed for agile scanning, which is key to intelligent and reliable artificial perception.

The unique patented system design of AEye’s MEMS allows a mirror that is less than 1mm in size. Other LiDAR systems use 3mm to 25mm mirrors, which equates to 10x to 600x larger surface area (see Figure 1). In addition, lacking intelligence-driven agility, these systems are forced to rely on these larger mirrors, increasing both complexity and cost. Combined with a 1550nm amplifiable laser and a sophisticated receiver, the small mirrors in AEye’s custom-designed MEMS are produced in volume using standard processes and deliver the unique high performance of iDAR with groundbreaking reliability.
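
The 10x to 600x comparison follows from simple geometry: a mirror's surface area (and with it the inertia that shock and vibration act on) grows with the square of its dimension. A quick back-of-the-envelope check, taking the round figures quoted above:

    # Back-of-the-envelope check of the mirror comparison quoted above:
    # area scales with the square of the mirror dimension.
    aeye_mirror_mm = 1.0            # AEye's MEMS mirror: described as under 1 mm
    other_mirrors_mm = (3.0, 25.0)  # other designs: cited at 3 mm to 25 mm

    for d in other_mirrors_mm:
        ratio = (d / aeye_mirror_mm) ** 2
        print(f"{d:g} mm mirror has ~{ratio:.0f}x the area of a 1 mm mirror")
    # Prints ~9x and ~625x, consistent with the article's 10x-600x range.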

Built on unique patented technologies

AEye’s iDAR platform and 4Sight sensors are backed by over 27 granted patents, 10 more patents in process, and over 1,300 claims covering system and component design and implementation. This broad patent portfolio includes several groundbreaking innovations, such as the only scanning lidar patent granted for a camera and lidar sharing the same optical axis (co-boresighted), eliminating the need for extensive parallax correction; MEMS agile control, feedback, and intraframe sampling, allowing for edge processing and low-latency feedback; and advanced perception, enabling real-time capabilities such as accurate intraframe calculation of object velocity.

Some of the unique features of the 4Sight M are:

LiDAR Performance

  • Software definable range optimization of up to 1,000 meters (eye- and camera-safe)
  • Up to 4 million points per second with horizontal and vertical resolution less than 0.1°
  • Instantaneous addressable resolution of 0.025°

Integrated Intelligence

  • Library of functionally-safe deterministic scan patterns that can be customized and fixed or triggered to adjust to changing environments (highway, urban, weather, etc.)
  • Integrated automotive camera, boresight-aligned with AEye’s agile LiDAR, instantaneously generating true-color point clouds (see the sketch after this list). A parallel camera-only feed can provide a cost-effective redundant camera sensor.
  • Enhanced ground plane detection to determine topology at extended ranges
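
What a boresight-aligned camera buys can be made concrete with a small sketch. The camera resolution, fields of view and the linear angle-to-pixel mapping below are simplifying assumptions, not AEye's implementation; because the camera and lidar share one optical axis, each return's azimuth and elevation index straight into the image, so no parallax correction is needed to attach a color to a point.

    # Hypothetical colorization of lidar returns with a boresight-aligned
    # camera. Resolution, fields of view and the linear angle-to-pixel
    # mapping are simplifying assumptions for illustration only.
    IMG_W, IMG_H = 1920, 1080   # assumed camera resolution
    FOV_H, FOV_V = 60.0, 35.0   # assumed horizontal/vertical fields of view (degrees)

    def angle_to_pixel(az_deg: float, el_deg: float) -> tuple[int, int]:
        """Map a return's azimuth/elevation (0 deg = optical axis) to a pixel."""
        u = int((az_deg / FOV_H + 0.5) * (IMG_W - 1))
        v = int((0.5 - el_deg / FOV_V) * (IMG_H - 1))
        return u, v

    def colorize(returns, image):
        """Attach the RGB value under each (azimuth, elevation, range) return."""
        colored = []
        for az, el, rng in returns:
            u, v = angle_to_pixel(az, el)
            if 0 <= u < IMG_W and 0 <= v < IMG_H:
                colored.append((az, el, rng, image[v][u]))
        return colored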

Advanced Vision Capabilities

  • Detection and classification of objects with advanced perception features such as intraframe radial and lateral velocity
  • Detection through foliage and adverse weather conditions such as rain and fog, via dynamic-range, full-waveform processing of multiple returns
  • Detection of pedestrians at over 200 meters
  • Detection of small, low reflective objects such as tire fragments, bricks or other road debris (10x50cm at 10% reflectivity) at ranges of over 120 meters

Reliability

  • Shock and Vibration – designed and tested for solid-state performance and reliability. In third-party testing, 4Sight has been proven to sustain mechanical shock of over 50G, random vibration of over 12Grms (5-2000Hz), and sustained vibration of over 3G on each axis.
  • Automotive-grade production:
    • Automotive-qualified supply chain utilizing standard production processes and overseen by global manufacturing partners
    • Designed for manufacturability using a simple solid-state architecture consisting of only 1 scanner, 1 laser, 1 receiver, and 1 SoC
    • Common hardware architecture and software/data structures across all applications, from fully autonomous to partially automated (ADAS), leveraging R&D and economies of scale.

Price

  • 4Sight can be configured for high-volume Mobility applications with SOP 2021 at an estimated 2x-5x lower price than any other high-performance LiDAR. For ADAS applications with SOP 2023, 4Sight is designed to be priced 1.5x-3x lower than any other long- or medium-range LiDAR.
  • 4Sight series production packaging options include roof, grille, and behind-the-windshield mounting, with software optimized for each placement

“In my work with automotive OEMs, the value delivered by the 4Sight sensor surpasses anything I have seen in the automotive industry,” said Sebastian Bihari, Managing Director.

Given the current constraints on travel, AEye is also announcing another industry-first innovation with the launch of Raptor – a unique high-performance web-based remote demo platform. Raptor will enable participants to engage in a real-time interactive test drive with an AEye engineer. From the comfort of their own home or office, AEye’s customers and partners will have the ability to see what a truly software-defined sensor can do, witness the record-breaking 4Sight M performance in real time, and customize the demo to meet their specific use cases.

StradVision’s New EU Manager and Centre

StradVision’s AI-based camera perception software is leading the way toward the future for Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AVs). The Company continues to expand into the European Union with the strategic hiring of a new EU General Manager. StradVision is a member company of the Born2Global Centre.

StradVision is pleased to announce that Andreas vom Felde, Ph.D., will join its European team as GM, following the opening of a new regional office in Munich, Germany earlier this year.
