CR Rates Automated Driving
Consumer Reports tested automated driving systems and declared GM’s Super Cruise more capable than Tesla’s Autopilot, which came in a distant second. CR liked Super Cruise’s infrared driver-monitoring camera. Meanwhile, Musk recruited beta testers for Tesla’s Full Self-Driving mode.
Hands-Free Drive Assist Coming on 2021 Ford F-150, Mustang Mach-E
Ford developed Active Drive Assist, which uses advanced computing of camera and radar sensing to provide real-time hands-free driving. The technology also enables expanded hands-free driving zones in the future based on system data and customer usage patterns.
The advanced new driver assist feature will arrive first on the 2021 F-150 and 2021 Mustang Mach-E, included as standard on certain models or offered as a relatively affordable option on others, with both vehicles becoming available to customers in late 2020. Ford expects to sell more than 100,000 F-150 and Mustang Mach-E vehicles equipped with Active Drive Assist hardware in their first year alone, based on company sales and take-rate projections.
“As breakthroughs in new technology allow us to help reduce the stress of long highway drives, it’s important to make sure these capabilities can be enjoyed by the largest spread of people possible,” said Hau Thai-Tang, chief product platform and operations officer, Ford Motor Company. “Active Drive Assist can help improve the driving experience while ensuring people remain aware and fully in control, all for a price unmatched by our competitors – a commitment to affordable innovations that has driven us since Henry Ford put the world on wheels.”
High tech priced right
When Active Drive Assist is not equipped as standard, it will be priced competitively, including:
- For F-150, Active Drive Assist will be available as part of the Ford Co-Pilot360 Active 2.0 package for $1,595. The Ford Co-Pilot360 Active 2.0 package is standard on F-150 Limited and available as an option on Lariat, King Ranch and Platinum models.
- For Mustang Mach-E, it will come standard on CA Route 1, Premium and First Edition variants. It’s also available on Select trims for $3,200 as part of the larger Comfort and Technology package, which includes features such as a 360-degree camera, heated front seats and heated steering wheel.
For customers purchasing F-150 and Mustang Mach-E at this year’s launch, the hardware enabling Active Drive Assist – including forward-facing camera and radar sensors – will be available through the Ford Co-Pilot360 Active 2.0 Prep Package, while customers choosing to purchase the software for $600 will receive it through an Over-the-Air Update in the third quarter of next year.
Over-the-Air Updates are quick, easy wireless upgrades that can enhance quality and capability and improve the ownership experience over time while reducing dealer trips. This will be an early demonstration of the Ford system’s bumper-to-bumper update capability to wirelessly update nearly all vehicle computer modules, enabling complex innovations that require software upgrades to vehicle functions.
For example, early F-150 customers can purchase the prep package that includes the Active Drive Assist hardware and Active Park Assist 2.0 even more affordably for $895, which includes a $100 early adopter incentive. When Active Drive Assist is ready to launch with software updates, customers will then be able to purchase the software – plus a three-year service period – for $600 and receive it via Over-the-Air Update.
In the second half of 2021, new customers will be able to purchase the hardware and software together in the Ford Co-Pilot360 Active 2.0 package, without the need for an Over-the-Air Update to initiate the feature.
By offering innovative new technology on its most popular, mainstream nameplates, Ford expects to quickly expand the number of vehicles on the road equipped with hands-free driving technology based on company sales projections. This includes approximately 80 percent of Mustang Mach-E vehicles that are expected to be equipped with the technology.
Ford plans to continue adding mapped areas to Active Drive Assist in the future, enabling hands-free driving on even more roads and highways. After a three-year service period, customers can choose to purchase this competitively priced connected service to continue enjoying Active Drive Assist and receive new improvements via Over-the-Air Update.
How it works
Available Active Drive Assist builds upon available Intelligent Adaptive Cruise Control with Stop-and-Go, Lane Centering and Speed Sign Recognition. It allows drivers to operate the vehicle hands-free while a driver-facing camera monitors them to make sure they keep their eyes on the road, with the potential for more enhancements in the future. This feature is available on prequalified sections of divided highways called Hands-Free Zones that make up over 100,000 miles of North American roads.
An advanced driver-facing camera will track eye gaze and head position to ensure drivers are paying attention to the road while in Hands-Free Mode as well as when they’re using hands-on Lane Centering Mode, which works on any road with lane lines. Drivers will be notified by visual prompts on their instrument cluster when they need to return their attention to the road or resume control of the vehicle.
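Ford has not published the logic behind these prompts, but the escalation described above can be sketched as a simple threshold check on eyes-off-road time. The thresholds and state names below are illustrative assumptions, not Ford's actual calibration:

```python
# Hypothetical thresholds (illustrative only, not Ford's calibration).
EYES_OFF_ROAD_WARN_S = 2.0    # show a visual prompt on the instrument cluster
EYES_OFF_ROAD_RESUME_S = 5.0  # prompt the driver to resume control

def attention_state(eyes_off_road_seconds: float) -> str:
    """Map continuous eyes-off-road time to an alert level."""
    if eyes_off_road_seconds >= EYES_OFF_ROAD_RESUME_S:
        return "resume_control"    # driver must retake control of the vehicle
    if eyes_off_road_seconds >= EYES_OFF_ROAD_WARN_S:
        return "attention_prompt"  # visual warning to return attention to the road
    return "attentive"
```

The same check would run in both Hands-Free Mode and hands-on Lane Centering Mode, with only the thresholds differing.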
As part of the available Ford Co-Pilot360 Active 2.0 package, customers will also receive Active Park Assist 2.0, the latest iteration of park-assist technologies to give drivers some peace of mind when parking their F-150 or Mustang Mach-E. With Active Park Assist 2.0, simply holding a button will allow the vehicle to take control of parking in parallel and perpendicular spaces with ease. It also offers Park Out Assist with side-sensing capability so drivers can confidently navigate out of a parking spot when someone’s parked too close.
*Active Drive Assist is a hands-free highway driving feature. The Active Drive Assist Prep Kit contains the hardware required for this feature. Software for the feature will be available for purchase at a later date.
*Active Drive Assist functionality expected in the third quarter of calendar year 2021. Separate payment for feature software is required to activate full functionality at that time.
Daimler Trucks & Waymo Partnership
Daimler Trucks and Waymo have signed a broad, global, strategic partnership to deploy autonomous SAE L4 technology. Their initial effort will combine Waymo’s industry-leading automated driver technology with a unique version of Daimler’s Freightliner Cascadia, to enable autonomous driving.
Waymo brings over a decade of experience building the World’s Most Experienced Driver™, having driven over 20 million miles on public roads across 25 U.S. cities and 15 billion miles in simulation. Daimler Trucks North America, Daimler Trucks’ U.S. subsidiary, parent company of the Freightliner brand and the U.S. market leader in commercial vehicle manufacturing, contributes its experience in developing state-of-the-art Class 8 vehicles.
Both Waymo and Daimler Trucks share the common goal of improving road safety and efficiency for fleet customers. The autonomous Freightliner Cascadia truck, equipped with the Waymo Driver, will be available to customers in the U.S. in the coming years. Waymo and Daimler Trucks will investigate expansion to other markets and brands in the near future.
Luminar Works with Daimler Trucks
Luminar Technologies, Inc., the global leader in automotive lidar hardware and software technology, and Daimler Truck AG, the world’s largest commercial vehicle manufacturer, announced a strategic partnership to enable highly automated trucking, starting on highways. Experts at Daimler Trucks, its U.S. subsidiary Daimler Trucks North America (DTNA), and Torc Robotics, part of Daimler Trucks’ Autonomous Technology Group, are collaborating with Luminar’s experts in pursuit of a common goal: bringing series-produced highly automated trucks (SAE Level 4) to roads globally. The teams will work closely together to enhance lidar sensing, perception, and system-level performance for Daimler trucks moving at highway speeds. To strengthen the partnership, Daimler Trucks has acquired a minority stake in Luminar.
Dr. Peter Vaughan Schmidt, Head of Autonomous Technology Group at Daimler Trucks: “Luminar has pioneered a critical enabling technology for bringing automated vehicles to the road, and we’re excited to work closely with them to drive this technology forward. Their company has proven visionary in its focus and unique ability to enable long-range sensing and high-speed driving on the highway. Our common goal is to enable safe deployment of highly automated trucks and shape the future of the trucking and logistics industry at large.”
The autonomous trucks are expected to yield dramatic improvements in the efficiency and safety of logistics, with an initial focus on long-haul highway routes. This constrained application of autonomy enables the technology to be commercially deployed in series production on a nearer-term timeframe than urban autonomous driving development.
“Our partnership with Daimler Trucks is spearheading the next era of commercial transportation, taking the multi-trillion-dollar global trucking and logistics industry head-on,” said Austin Russell, Luminar’s Founder and CEO. “The business case for autonomous trucking is incredibly strong, and it is now seeing the first OEM program to bring it to the world.”
Michael Fleming, CEO of Torc Robotics: “We are excited by the opportunity to work with Luminar and their long-range, high resolution Lidar to improve truck safety and enable us to commercialize self-driving trucks. This is a critical, enabling technology on our development path.”
The partnership between Luminar and Daimler Trucks will extend beyond providing critical automotive technology solutions. As part of their joint commitment to safety, the companies will also collaborate on safety standards and operating practices, and make future policy advancements and safety enhancements as a result of the joint program.
Lexus TED Fellows
As the world builds toward autonomous driving, Lexus continues to keep the focus on what inspires the brand—people. Lexus partnered with the TED Fellows program, a global multidisciplinary group of thought leaders, to develop new designs for autonomous vehicles that prioritize people over technology. TED Senior Fellows neuroscientist Greg Gage and artist Sarah Sandman debuted their designs for human-centric autonomous vehicles in an exclusive virtual event today, and two short films spotlighting their ideas are live now: “Predicting Human Needs with Neuroscientist Greg Gage” and “Building Community with Artist Sarah Sandman.” In an interactive discovery session, TED Senior Fellow Samuel “Blitz” Bazawule explored how inspiration can be found in sound.
As a neuroscientist and engineer, TED Senior Fellow Greg Gage believes autonomous vehicles and long commutes make the perfect environment for neurotechnology to create a “brain-car” interface that integrates the automobile directly with how the driver feels. By recording a range of human signals, from EKG to facial microgestures, Gage envisions that Lexus could create a profile of a person’s mood and use it to change the ambience of the car. Sleepy? The car will change the lights and seating position. Relaxed? The car will cue up a chill playlist. Stressed? The radio falls silent and the lighting adjusts. If current cars can monitor engine temperature, oil pressure and engine speed, then a human-inspired car could monitor the human and respond accordingly.
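Gage’s concept amounts to a mapping from an inferred mood to a set of cabin adjustments. A minimal sketch, with every mood label and response invented purely for illustration:

```python
# Hypothetical mood-to-ambience mapping (all labels and responses are
# invented for this sketch; nothing here reflects an actual Lexus design).
AMBIENCE_RESPONSES = {
    "sleepy":   {"lights": "brighten", "seat": "upright",  "audio": "unchanged"},
    "relaxed":  {"lights": "unchanged", "seat": "unchanged", "audio": "chill_playlist"},
    "stressed": {"lights": "soften",   "seat": "unchanged", "audio": "mute_radio"},
}

def respond_to_mood(mood: str) -> dict:
    """Return cabin adjustments for an inferred mood; default is no change."""
    return AMBIENCE_RESPONSES.get(
        mood, {"lights": "unchanged", "seat": "unchanged", "audio": "unchanged"}
    )
```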
As an artist and designer, TED Senior Fellow Sarah Sandman foresees a future where people are even more buried in their digital devices, and the simple gesture of waving to a pedestrian is lost. With community in mind, Sandman has envisioned a vehicle that connects the inside to the outside world with 360-windows, fully rotational seats, a speaker system for interaction with cyclists or pedestrians, and even a chalk-writing system to leave custom art or messages in the street. To increase quality time with passengers, the interior would mimic a cozy cafe with a digital fireplace, pillows and a terrarium-like ceiling. Sandman also proposes a cooperative ownership model that increases affordability and makes for a more inclusive, meaningful future.
“The TED Fellows program supports a community of 492 Fellows from 99 countries in every discipline—from design and activism to astrophysics and neuroscience—working to create a positive impact in their communities. TED’s partnership with Lexus highlights the essence of what the TED Fellows program stands for—curiosity, new ideas and building a world together that we all want to live in,” said Shoham Arad, Director of the TED Fellows program. “We are excited to see the imaginative work of TED Fellows come to life through this TED + Lexus virtual conversation, fun discovery session and spotlight films. What does the future of human-centered design look like? Even in an autonomous universe? Watch to find out.”
Following the discussion with Gage and Sandman, musician and filmmaker Blitz led a discovery session during which he invited attendees to source sounds from within their own homes to create a unique and inspired melody/beat.
Lytx New Tech
Lytx, a leading provider of machine vision- and artificial intelligence-powered video telematics, analytics, safety, and productivity solutions for commercial, public sector, and field service fleets, announced several new technology capabilities that build upon its ability to quickly, comprehensively, and accurately identify driving risk.
Built on precise, cutting-edge technology, this new driver-powered approach to safety is a simple but powerful way for drivers to be more proactive and accountable for their own improvement, while giving management the visibility and data needed to monitor effectively and intervene if necessary.
The Next Generation of Driver-Powered Safety
Lytx is adding a number of new capabilities to its safety offerings in support of this driver-powered approach, including:
- “Inattentive” trigger, which uses proprietary machine vision and artificial intelligence (MV+AI) to detect when the driver’s attention may be unfocused or the driver may be experiencing a condition such as fatigue or drowsiness without the reliance on an accelerometer event
- Real-time in-cab alerts for six risky driving behaviors: cell phone use, eating and drinking, smoking, no seatbelt, speeding, and inattentiveness
- Behavior duration reporting, which uses MV+AI to track and quantify both the duration and percentage of drive time a driver was engaged in a risky driving behavior, providing a more holistic view of persistent risk
With these new MV+AI-powered updates, when an event is detected, the DriveCam® Event Recorder will issue a real-time in-cab alert to help drivers recognize and address their own risky behaviors and self-correct in the moment. Depending on the behavior, the alert will include a light and/or spoken phrase. With Lytx’s ability to detect more than 60 driving behaviors with greater than 95% accuracy, its in-cab alerts are some of the most precise and actionable in the industry.
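Behavior duration reporting, as described above, reduces to summing detected-behavior intervals and dividing by total drive time. A minimal sketch, assuming hypothetical behavior labels and event tuples rather than Lytx’s actual data model:

```python
def behavior_duration_report(events, total_drive_seconds):
    """Summarize risky-behavior time and its share of total drive time.

    events: (behavior, start_s, end_s) tuples as detected by MV+AI.
    Returns {behavior: (total_seconds, percent_of_drive_time)}.
    """
    totals = {}
    for behavior, start, end in events:
        totals[behavior] = totals.get(behavior, 0.0) + (end - start)
    return {
        behavior: (seconds, 100.0 * seconds / total_drive_seconds)
        for behavior, seconds in totals.items()
    }

# One hour of driving with two cell-phone intervals and one unbuckled stretch.
report = behavior_duration_report(
    [("cell_phone", 120, 180), ("cell_phone", 600, 660), ("no_seatbelt", 0, 300)],
    total_drive_seconds=3600,
)
# cell_phone: 120 s (~3.3% of drive time); no_seatbelt: 300 s (~8.3%)
```

Reporting a percentage of drive time, rather than a raw event count, is what gives the “more holistic view of persistent risk” the bullet above describes.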
Humanising Autonomy Working with Ambarella
Humanising Autonomy, a predictive AI company, announced today it is working with Ambarella, Inc. to deliver cutting-edge perception and human behaviour analytics for advanced driver assistance systems (ADAS), autonomous vehicles (AVs), and consumer dash cameras. Integrating Humanising Autonomy’s HAxEdge intent prediction engine with Ambarella’s CVflow® chip range enables advanced vulnerable road user (VRU) perception in a variety of automotive cameras, with the solution available now for immediate deployment.
The solution combines Ambarella’s deep understanding of core computer vision algorithms with Humanising Autonomy’s specialised knowledge of human intent prediction, bringing an exceptional, automotive-grade, ASIL B-rated camera solution to market. Device manufacturers and automakers can improve the safety-critical functionality of ADAS and AV systems with HAxEdge optimized for the CVflow range of SoCs. The low-power, modular solution can be deployed for a variety of use cases, including real-time intent prediction for forward collision warning, blind spot detection, automated emergency braking and adaptive cruise control functionality.
Motional Partners with Via
Motional, a global driverless technology leader, and Via, the leader in public mobility solutions, announce a first-of-its-kind partnership that will serve as a blueprint for the future of on-demand, shared robotaxis. Motional and Via seek to use learnings from the partnership to inform how driverless vehicles can be integrated into mass transit networks, and optimized for pooled rides.
The partnership’s first service — shared robotaxi rides, available to the public — is expected to launch in one of Motional’s existing U.S. markets in the first half of 2021, expanding access to safe, affordable, and efficient transportation options. The initial launch market will be announced at a later date.
Via and Motional share a long-term vision to change how the world moves, and are building the digital infrastructure to help self-driving vehicles reach their full potential as part of large-scale transit networks. This requires vehicles to be on-demand, optimally routed, and shared by multiple passengers. Efficient utilization of self-driving vehicles will relieve congestion, improve environmental sustainability, and reduce operating costs for transit providers, while expanding transportation options for consumers.
Continental Invests in AEye
Technology company Continental is further strengthening its LiDAR sensor portfolio through a minority investment in LiDAR pioneer AEye, Inc. LiDAR sensors, alongside camera and radar, are among the key technologies for Automated Driving, and Continental has accumulated over 20 years of expertise in LiDAR sensors alone. AEye, located in Dublin, California (USA), has developed a long-range LiDAR technology combining an amplifiable 1550nm laser with a patented feedback-controlled Microelectromechanical System (MEMS) scanner. The technology can be configured via software and thus optimized for each manufacturer’s vehicles and applications. The AEye LiDAR offers maximum leverage for passenger and commercial vehicle applications because it combines high dynamic spatial resolution with long-range detection: vehicles can be detected at a distance of more than 300 meters and pedestrians at more than 200 meters. AEye’s ability to detect small, low-reflectivity objects, such as bricks, at a distance of 160 meters with multiple measuring points is pivotal for Automated Driving in both passenger cars and commercial vehicles. Continental will utilize this LiDAR technology and industrialize the sensor to deliver a fully automotive-grade product. The first series production is currently scheduled for the end of 2024.
By partnering with AEye, Continental complements its existing short-range 3D Flash LiDAR technology, which goes into series production later this year, supporting highly automated driving in a global premium vehicle program. This start of production of the High-Resolution 3D Flash LiDAR (HFL) is a key milestone. It is the first high-resolution solid-state LiDAR sensor to go into series production in the automotive market worldwide.
“We now have optimum short-range and world-class long-range LiDAR technologies with their complementary set of benefits under one roof. This puts us in a strong position to cover the full vehicle environment with state-of-the-art LiDAR sensor technology and to facilitate Automated Driving at SAE levels 3 or higher in both passenger cars and commercial vehicle applications,” said Frank Petznick, head of the Advanced Driver Assistance Systems (ADAS) business unit at Continental.
Blair LaCorte, CEO of AEye Inc., welcomes the Continental investment by saying, “ADAS solutions require a unique mix of performance, scalability, packaging, and a long-term commitment to reliability and safety. Continental is a recognized leader in automotive sensing technology as well as in automotive product industrialization and commercialization. We look forward to working closely with their team to customize our modular and scalable design to deliver Continental’s high-performance long-range LiDAR systems to the world’s leading vehicle manufacturers.”
Commercial vehicle application is a touchstone
Automated vehicles capable of AD SAE level 3 or higher require a sensor setup that includes camera, radar and LiDAR to detect objects and usable trajectories around the vehicle. LiDAR sensors offer the strength of robust 3D pixel level detection at high resolution. Continental uses tailored automotive-grade LiDAR technology for both short- and long-range sensing. For the short range, 3D Flash LiDAR technology offers 3D pixel images very quickly and precisely by illuminating and capturing an entire scene in one pulse per frame of data (global shutter technology). For a robust detection of objects at long-range distances, 1550nm agile LiDAR technology provides a proven combination of software configurable HD resolution of over 1600 points per square degree and detection ranges beyond 300 meters. The patented MEMS-based design of AEye’s LiDAR provides tremendous solid-state reliability, while also delivering uncompromising performance under adverse weather and road conditions.
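To put the quoted figures in perspective: a density of over 1600 points per square degree implies an angular spacing of at most 1/40 of a degree, which works out to roughly 13 cm between adjacent points at 300 meters. A small sketch of that back-of-the-envelope calculation, assuming a uniform square grid of points:

```python
import math

def point_spacing_m(points_per_sq_deg: float, range_m: float) -> float:
    """Approximate linear spacing between adjacent lidar points at a given
    range, assuming a uniform square grid of measurement points."""
    points_per_deg = math.sqrt(points_per_sq_deg)  # linear density along one axis
    angular_spacing_deg = 1.0 / points_per_deg
    return range_m * math.tan(math.radians(angular_spacing_deg))

# At the quoted 1600 points/deg^2 and 300 m range:
spacing = point_spacing_m(1600, 300)  # ≈ 0.13 m between adjacent points
```

Spacing on this order is what makes it possible to place multiple measurement points on a brick-sized object at long range, as the paragraph above describes.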
Commercial vehicles with their large mass and longer stopping distance face special challenges to enable safe automated driving. Automation for these vehicles will require a maximum sensor range and resolution to ensure sufficient processing time for automated decisions and actions.
“By bringing leading edge technology together from all three environmental sensor areas, we are creating synergetic effects that will benefit the vehicle manufacturers,” Petznick said. “Continental has vast expertise in all three sensor technologies and in software development.”
StradVision Joins Renesas’ R-Car Consortium
StradVision, a leading innovator in AI-based camera perception software for Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AV), has been selected as a member of Renesas’ R-Car Consortium Proactive Partner Program for the second consecutive year. Satisfying the consortium’s criteria of “open”, “innovative”, and “trusted”, StradVision was verified as a high-performance company with a strong track record in the automotive market, enabling quick engagement with customers.
“Being selected as a member of the R-Car Consortium Proactive Partner Program is a fantastic opportunity to work even more closely with compatible vendors in the automotive industry, and it’s a very exciting prospect to be part of this platform to speed up the advancement of ADAS technology,” says StradVision CEO Junhwan Kim.
The R-Car Consortium is an open platform environment organized by Renesas that enables customers to quickly identify and engage with ecosystem partners whose solutions help accelerate innovation for the future mobility market, strengthening research and development for the connected car and ADAS.
StradVision has been working closely with Renesas since the announcement of their collaboration on developing a deep learning-based object recognition solution for smart cameras used in ADAS in September 2019.