Waabi Introduces Waabi Driver
The Waabi Driver combines Waabi’s AI-first autonomy stack (the software) with sensors and compute (the hardware). Together, they form a complete solution designed for factory-level OEM integration, large-scale commercialization, and safe deployment.
The Waabi Driver is an end-to-end trainable system that automatically learns from data, speeding up development dramatically and enabling it to learn the complex decision-making needed to operate safely on the road. This contrasts with the traditional approach, which is brittle, overly complex, and requires painstaking manual code adjustments.
The AI-first approach is empowered by Waabi World, the ultimate school for self-driving, which exposes the Waabi Driver to the vast diversity of scenarios needed to hone its driving skills, including both common situations and safety-critical edge cases. To put that into perspective, it would otherwise take thousands of self-driving vehicles driving millions of miles each to experience these situations on the road. This approach drastically reduces the need to drive in the real world, resulting in a solution that is far more sustainable, and one that is both smarter and safer before the wheels even start turning. On-road driving is primarily reserved for the final step of development: validation and verification. This is a major paradigm shift.
While the Waabi Driver is AI-first, that doesn’t mean it is a black box that no one can understand. The autonomy stack is modular and produces intermediate interpretable representations, allowing us to trace and validate every decision it makes.
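To make the idea of a modular stack with interpretable intermediates concrete, here is a minimal Python sketch. It is entirely hypothetical: the stage names and data schema are illustrative, not Waabi’s actual architecture. The point is that each stage emits an inspectable record, so a decision can be traced back through the pipeline.

```python
from dataclasses import dataclass


@dataclass
class Intermediate:
    stage: str
    payload: dict


class TraceablePipeline:
    """Runs stages in order and records each stage's interpretable output."""

    def __init__(self, stages):
        self.stages = stages  # list of (name, fn) pairs
        self.trace = []       # one inspectable record per stage

    def run(self, sensor_input):
        state = sensor_input
        for name, fn in self.stages:
            state = fn(state)
            self.trace.append(Intermediate(stage=name, payload=dict(state)))
        return state


# Toy stand-ins for perception, prediction, and planning.
def perceive(frame):
    return {**frame, "objects": frame.get("detections", [])}


def predict(state):
    return {**state, "forecasts": [o + "_ahead" for o in state["objects"]]}


def plan(state):
    return {**state, "trajectory": "keep_lane"}


pipeline = TraceablePipeline([("perception", perceive),
                              ("prediction", predict),
                              ("planning", plan)])
decision = pipeline.run({"detections": ["car", "pedestrian"]})
# pipeline.trace now holds the intermediate output of every stage
```

Because every stage appends its output to `pipeline.trace`, a validator can inspect exactly what the system perceived and predicted before it chose a trajectory.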
To fully unleash the power of autonomy, you need a product that scales. The Waabi Driver boasts superior generalization capabilities so it can safely apply learned skills to unseen scenarios and brand new geographies, without ever having driven there before. These capabilities lay the foundation for safe and scalable operations by unlocking new autonomous trucking routes with unprecedented speed.
The Waabi Driver is also purpose-built with production intent from day one. It’s adaptable to multiple redundant truck platforms and easily integrated on the assembly line with no disruption. The hardware solution is plug-and-play, lightweight, simple to maintain, and aerodynamic to maximize fuel savings. This all makes the Waabi Driver the most flexible autonomous trucking solution available.
Waabi employs multiple sensors such as LiDARs, cameras, and radars for increased redundancy and safety. Leveraging the sensor simulation capabilities within Waabi World, Waabi was able to streamline sensor selection, integrate the latest technology quickly, and ultimately deliver the most flexible solution for OEM partners.
Zenseact OnePilot on Volvo EX90
With the release of Volvo Cars’ fully electric flagship SUV – the Volvo EX90 – Zenseact introduces OnePilot: AI-powered software that will offer drivers a new level of safety.
OnePilot marks the start of Zenseact’s journey to autonomous driving, with both ADAS (advanced driver assistance systems) and AD (autonomous driving) functionalities. It’s a significant step in the company’s ambition to reduce traffic accidents and create safe roads for everyone.
The path to safe automation
Zenseact’s commitment to automation – to gradually turn cars into perfect drivers – is essentially based on understanding human behavior. The predictive safety principle – a set of threat assessment, decision-making, and verification practices that enhance traffic safety – frames the technology, training the software to anticipate and avoid complex traffic situations, much like a human would.
This is done through continuous iterations in which the software is fed data from real-life incidents recorded by cars equipped with OnePilot. The data is stored and forms the basis of machine learning and future updates. As a result, continuous deployment of improved software will make the Volvo EX90 safer over time.
“The launch of the Volvo EX90 featuring OnePilot is a milestone in automotive safety. Through its continuous data gathering, the Volvo EX90 will provide us with data from the real world in combination with simulated scenarios, helping us progress towards zero collisions faster,” Ödgärd Andersson, CEO at Zenseact, says.
A critical feature of OnePilot is its ability to handle driving in the dark: the new lidar-based safety technology in the Volvo EX90 builds a 3D point cloud of the environment around it – equally well in darkness and in light – at up to 250 meters ahead. It can thus detect pedestrians and other objects even when visibility is reduced, giving the driver a chance to avoid hazardous situations altogether.
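As a simplified illustration (not Zenseact’s implementation): a lidar point cloud is just a set of (x, y, z) returns, and because lidar supplies its own illumination, a plain range filter behaves identically in darkness and in daylight.

```python
import math

# Illustrative sketch only. A lidar return is an (x, y, z) point in meters;
# the 250 m figure is the forward detection range cited for the EX90 lidar.
MAX_RANGE_M = 250.0


def points_in_range(cloud, max_range=MAX_RANGE_M):
    """Keep returns whose Euclidean distance falls within the sensor's range."""
    return [p for p in cloud
            if math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2) <= max_range]


cloud = [(10.0, 1.0, 0.2),    # nearby pedestrian-height return
         (240.0, -3.0, 1.1),  # distant object, still inside 250 m
         (310.0, 0.0, 0.5)]   # beyond the sensor's range, dropped
near = points_in_range(cloud)
```

Downstream perception would then cluster and classify the retained points; the range filtering itself is unaffected by ambient light, which is the property the article highlights.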
Different modes for different moods
Moreover, OnePilot includes three modes of operation: Drive, Cruise, and Ride. ‘Drive’ mode is always on, assisting an active driver, while ‘Cruise’ involves autonomous driving elements under the driver’s supervision. ‘Ride’ mode is a fully autonomous solution installed in the car. Ride is currently at the data-capture stage, but once it is validated and deployed, the Volvo EX90 will be the first Volvo that is hardware- and software-ready for unsupervised driving.
As leading safety innovators, Zenseact and Volvo Cars share the mission of zero collisions on the roads. The Volvo EX90 offers a safer and smoother driving experience.
“Volvo Cars is proud to have Zenseact’s cutting-edge technology in the new Volvo EX90, helping to create an invisible shield of safety for drivers,” says Elsa Eugensson, Senior Program Manager AD & ADAS at Volvo Cars.
Cepton Partners with Exwayz
Cepton, Inc. (“Cepton” or the “Company”) (Nasdaq: CPTN), a Silicon Valley innovator and leader in high-performance lidar solutions, is collaborating with Exwayz to demonstrate new lidar-based perception solutions for mobile robotics applications.
Exwayz provides plug-and-play perception software to enable lidar-based 3D mapping, localization and re-localization as well as object detection and classification. Its complete SDK for real-time 3D lidar processing is used primarily in mobile robotics for logistics, construction, security and more. Lidar solution providers, regardless of their own software capabilities, can integrate Exwayz’s software with their own lidar hardware for an easy demonstration of how lidar enhances intelligence for robotic applications. Cepton’s collaboration with Exwayz adds to Cepton’s existing perception solutions, creating new possibilities to streamline the development and demonstration of lidar-integrated robotic systems across a fast-growing customer base.
Due to its excellent 3D sensing capabilities and ability to operate under challenging lighting conditions, lidar is quickly becoming recognized as an essential sensor technology within robotics. Lidar offers unparalleled accuracy in detecting an object’s distance, size and moving speed. Lidar’s high-precision 3D data not only helps robotic systems navigate safely and autonomously, but also allows operators to easily track their locations while gaining critical insights into how a space is being utilized (e.g., footfall traffic, space occupancy and crowd density). Lidar performs day and night, indoor and outdoor, making it available 24/7. In addition, lidar does not collect biometric data, allowing it to be used in privacy-sensitive venues to accurately classify people from other objects and anonymously track their movement.
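As a rough illustration of how lidar returns could feed the space-utilization analytics described above without any biometric data, here is a hedged sketch: bin ground-plane points into grid cells and count occupancy. The grid size and data schema are illustrative assumptions, not Cepton’s or Exwayz’s actual processing.

```python
from collections import Counter

CELL_M = 1.0  # 1-meter grid cells (illustrative choice)


def occupancy_map(points, cell=CELL_M):
    """Count lidar returns per ground-plane grid cell.

    Each point is an (x, y, z) return in meters; z is ignored because
    occupancy is measured on the ground plane.
    """
    counts = Counter()
    for x, y, _z in points:
        counts[(int(x // cell), int(y // cell))] += 1
    return counts


points = [(0.2, 0.3, 1.7),  # two returns in the same cell near the origin
          (0.4, 0.1, 1.6),
          (5.5, 2.2, 1.8)]  # one return in a different cell
grid = occupancy_map(points)
busiest_cell, count = grid.most_common(1)[0]
```

Aggregating such cell counts over time yields exactly the kind of footfall-traffic and space-occupancy insight the article mentions, while the underlying points carry no identity information.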
Cepton offers a full suite of lidar sensors for automotive and smart infrastructure applications, as well as proprietary perception solutions. Cepton’s collaboration with Exwayz further unlocks the potential utilization of its lidar technology across a wide range of mobile robotics applications. By leveraging the versatility and reliability of Cepton’s high-performance, easy-to-integrate lidars, Cepton and Exwayz aim to expedite solution integration for global customers.
DIU Selects Applied Intuition
Applied Intuition, Inc. announced today that it has been selected by the Army and the Defense Innovation Unit (DIU) to deliver an end-to-end autonomy software development and test platform for the Army’s Robotic Combat Vehicle (RCV) program. The contract, which carries a $49 million ceiling for the competitive prototyping phase, will span 24 months.
Applied Intuition will provide a foundational modeling and simulation platform that will enable the RCV program office, under the umbrella of PEO Ground Combat Systems, to manage the development and test of software for mission and mobility autonomy for the RCV variants.
Applied’s end-to-end autonomy development solution will enable the RCV program to meet requirements related to off-road maneuvering, obstacle avoidance, and safety. Applied’s toolchain will help the RCV program evaluate autonomy stacks developed by the Army and its other commercial partners.
“We are excited to bring our proven enterprise autonomy development toolchain to the Army’s RCV program,” said Qasar Younis, Co-Founder and CEO of Applied Intuition. “Our modeling and simulation development environment will enable continuous improvement of autonomy software across the program’s lifecycle and will ultimately enhance the Army’s broader approach to autonomy stack development.”
The award is the result of an innovative contracting mechanism, DIU’s Commercial Solutions Opening, where the Army’s RCV program worked in close coordination with DIU to acquire commercial software as a part of the Software Pathway under the Agile Acquisition Framework.
“The innovative use of the Department of Defense’s (DOD) Software Acquisition Pathway to acquire commercial modeling and simulation software for autonomy development is a landmark achievement,” said Colin Carroll, Head of Government at Applied Intuition. “We look forward to helping the RCV program and the DOD quickly and safely scale production of autonomous systems.”
Waymo Autonomous Driverless Weather Maps
Waymo’s latest hardware – complete with cameras, radar, and lidar – uses the raindrops on its windows, or lack thereof, to classify various weather conditions. Researchers have used laser-based instruments for years to measure the properties of clouds, fog, dust storms, snow, and rainfall. By applying a similar approach to the data collected by our core sensing suite and combining it with high-quality ground-truth data from weather visibility sensors, we’ve generated a quantitative metric of meteorological visibility.
The metric enables the Waymo Driver to conduct a quantitative analysis of the weather around the vehicle in real time, determining whether it’s foggy, raining, or something else altogether, and gauging the intensity of the condition. Each Waymo vehicle operates as an autonomous “mobile weather station”, providing an unprecedented understanding of weather in the areas we drive.
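Waymo has not published the metric itself. A classical way to relate lidar-estimated atmospheric extinction to a visibility distance is the Koschmieder relation, sketched here with illustrative classification thresholds; the extinction value and the buckets are assumptions for demonstration, and a real system would be calibrated against ground-truth visibility sensors as the article describes.

```python
import math

# Standard Koschmieder contrast threshold of 5%: visibility is the range at
# which an object's contrast against the horizon drops to this fraction.
CONTRAST_THRESHOLD = 0.05


def visibility_m(extinction_per_m):
    """Meteorological optical range (m) from an extinction coefficient (1/m)."""
    return -math.log(CONTRAST_THRESHOLD) / extinction_per_m  # ~3.0 / sigma


def classify(vis_m):
    # Illustrative buckets only; operational thresholds would be calibrated.
    if vis_m < 1000:
        return "fog"
    if vis_m < 4000:
        return "mist/haze"
    return "clear"


sigma = 0.004               # strong attenuation, e.g. dense coastal fog
vis = visibility_m(sigma)   # roughly 750 m of visibility
condition = classify(vis)
```

The attraction of a formulation like this is that it turns raw sensor attenuation into a single interpretable number, which is what lets each vehicle act as a quantitative weather station rather than a binary "rain / no rain" detector.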
Introducing a first-of-its-kind weather map
Today’s weather tools and datasets often lack the precision and specificity to reflect the conditions where our vehicles are driving. For example, weather stations – which are generally considered the best source for real-time weather information – are typically located at airports to support aviation safety and climate monitoring applications. But even over short distances – such as between San Francisco International Airport and the Sunset district – local conditions can vary significantly.
While remotely sensed weather data from satellites or weather radars help fill in the gaps from local weather stations, they also do not directly sense weather conditions near the surface, especially when clouds are in the way.
By employing the fleet of autonomously driven vehicles as mobile weather stations and combining millions of data points about the weather across time and space, Waymo developed a first-of-its-kind fog map that provides unparalleled spatial-temporal resolution and helps better inform operations. With these insights, the autonomous fleet can track the progression of coastal fog as it flows in from the Pacific Ocean and burns off as the sun rises later in the morning. It can even detect drizzle and light rain that lead to wet roads in situations that are invisible to the National Weather Service’s local Doppler weather radar. These weather observation capabilities allow Waymo to localize where conditions are beginning to deteriorate or improve. Waymo is using these capabilities now to enable its ride-hailing services in San Francisco and Phoenix, and it will create similar weather maps for additional cities as it scales.
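The core of a fleet-sourced weather map is spatial-temporal aggregation. Here is a minimal sketch under an assumed, hypothetical report schema of (area cell, hour of day, visibility in meters); the cell names and values are invented for illustration.

```python
from collections import defaultdict


def build_map(reports):
    """Average visibility per (cell, hour) across all vehicle reports."""
    sums = defaultdict(lambda: [0.0, 0])  # (cell, hour) -> [total, count]
    for cell, hour, vis_m in reports:
        entry = sums[(cell, hour)]
        entry[0] += vis_m
        entry[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}


# Two foggy morning reports from one neighborhood, one clear late-morning
# report after the fog burns off.
reports = [("sf_sunset", 7, 600.0),
           ("sf_sunset", 7, 800.0),
           ("sf_sunset", 11, 9000.0)]
fog_map = build_map(reports)
```

With many vehicles reporting, each (cell, hour) bucket becomes a dense time series, which is what lets the map capture block-level differences that a single airport weather station cannot resolve.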
Nexar Releases CityStream
Nexar, a leading AI computer vision company, announced today the release of CityStream™ Live, the company’s Real-Time Mapping (RTM) platform. As a first-of-its-kind solution, CityStream™ Live enables the mobility industry, including connected vehicles, maps, mobility services, digital twins, and smart city applications, to access a continuous stream of fresh, crowdsourced road data. Only with real-time data can vehicles really know what’s coming their way, react to varying speed limits, avoid work zones, find parking, and someday drive themselves. Thanks to Nexar’s massive network of “eyes on the road”, edge AI, and change detection capabilities, CityStream™ Live is already available to industry design partners.
As software is “eating” vehicles and roads, today’s digital maps fall short of the freshness and precision that software-driven auto OEMs, autonomous vehicles, and mobility players require. Standard mapping methods – such as SD, HD, and traffic maps – fail to provide accurate, up-to-date, and cost-effective solutions. By introducing new and disruptive RTM technologies at the edge of the network, Nexar’s CityStream™ Live is transforming how road information is captured and delivered to the mobility ecosystem. In addition to the company’s applications for detecting work zones, road sign changes, potholes, and free parking spaces, this new platform will provide unprecedented levels of freshness on nearly every road across the US at a dramatically reduced cost.
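A hedged sketch of the change-detection idea: compare freshly observed road features against the map prior and transmit only the differences, which is what keeps an edge-based approach cheap on bandwidth. The feature names are illustrative, not Nexar’s actual schema.

```python
def detect_changes(prior, observed):
    """Return the features added to and removed from the prior map."""
    prior_set, observed_set = set(prior), set(observed)
    return {"added": sorted(observed_set - prior_set),
            "removed": sorted(prior_set - observed_set)}


# Map prior vs. a fresh drive past the same road segment.
prior = ["speed_limit_65", "lane_count_3"]
observed = ["speed_limit_55", "lane_count_3", "work_zone"]
changes = detect_changes(prior, observed)
```

Only the `changes` delta needs to leave the vehicle; unchanged features (here, the lane count) generate no traffic at all, so freshness scales with how often roads actually change rather than with fleet mileage.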
Utilizing a crowdsourcing network and edge AI software, CityStream™ Live offers users and developers a live data feed to increase situational awareness, enhance driving capabilities, increase safety, add comfort and help solve everyday mobility challenges. With over 700,000 vehicles in Nexar’s network of cameras capturing 94% of US roads each month, Nexar collects 3 billion miles of road vision data per year. By combining massive data aggregation with on-the-fly data curation, CityStream™ Live is the first platform to deliver road data streams in real time and at scale, supporting an unprecedented number of urban and highway use cases.