Autonomous & Self-Driving Vehicle News: Optimus Ride, Polaris, Trimble, AEye, ON Semi, Woven Planet & Aurora

In autonomous and self-driving news are Optimus Ride, Polaris, Trimble, AEye, ON Semi, Woven Planet and Aurora.

Optimus Ride & Polaris

Optimus Ride, a leading autonomous mobility service provider, and Polaris Inc. announced a partnership to bring fully autonomous GEM vehicles to market. Under this joint development agreement, an exclusive line of Polaris GEM electric low-speed vehicles (LSVs) will be manufactured to fully integrate Optimus Ride’s autonomous software and hardware suite direct from the factory for deployment nationwide on streets in residential communities, corporate and academic campuses, and other localized environments. This partnership builds upon an investment Polaris made last year in Optimus Ride. Combining Optimus Ride’s cutting-edge, full-stack autonomous vehicle technology with Polaris’ ability to quickly develop and scale production of these state-of-the-art vehicles represents an important milestone in the deployment of driverless, remotely monitored autonomous vehicles.

Together, Optimus Ride and Polaris Commercial, a division of Polaris, have set out to solve a real mobility problem through autonomous vehicle technology. In many environments, cars, vans and buses play inefficient roles in mobility, creating unnecessary pollutants, costs and safety concerns. Right-sized, all-electric, fully autonomous GEMs are a safe, cost-effective and sustainable solution. In fact, Optimus Ride has already successfully completed more than 75,000 rides with Polaris GEM vehicles over the last two years across deployment sites throughout the country, demonstrating significantly lower transportation investment, an enhanced rider experience and the established safety benefits of LSVs.

Trimble Intros Trimble MX50

Trimble (NASDAQ: TRMB) introduced the Trimble® MX50 mobile mapping system for asset management and mapping—a new addition to its established mobile mapping portfolio. This vehicle-mounted mobile LiDAR system is a mid-range option for first-time mobile mapping users and experienced providers to expand their equipment fleet with precise, high-volume data capture technology that works in conjunction with Trimble’s geospatial software solutions.

By providing clean and accurate data of ground surfaces, the Trimble MX50 is a practical choice for highway and road inspection and maintenance organizations; city, state and local governments; public utilities; contractors; and survey companies wanting to expand their service capabilities.

The Trimble MX50 features new Trimble-designed profiling lasers for high-accuracy data collection, a 360-degree panoramic camera and a GNSS/IMU positioning system from Applanix, a Trimble Company. The system produces dense point clouds and immersive imagery for surveying and mapping accuracy, and works with Applanix POSPac, Trimble Business Center and the Trimble MX software suite. The Trimble MX50 also expands the company’s mobile mapping portfolio, which includes the widely adopted Trimble MX9 system for large scanning and mapping missions and the highly portable Trimble MX7 for capturing precisely positioned street-level imagery.

AEye News

1000 Meters in Rain

AEye, Inc. (“AEye”), the global leader in adaptive, high-performance LiDAR solutions, announced its LiDAR sensor – already known for extreme long-range capabilities – has achieved yet another milestone: 1000 meter range in rain, behind windshield glass. The test was performed at the American Center for Mobility (ACM) test track in Ypsilanti, Michigan, with results verified by active safety and automated vehicle technologies researcher, VSI Labs.

AEye LiDAR Outperforms in Rain, Behind the Windshield
AEye’s sensor has already been independently verified to have twice the range of its nearest LiDAR competitor. The new test shows that AEye’s adaptive LiDAR not only achieves groundbreaking range, it does so in adverse weather conditions and behind a first surface: in this case, a windshield.

The test was conducted using VSI’s research vehicle, which integrated AEye’s sensor into its AV stack to study the impact of adaptive LiDAR on the performance and safety of automated functionality. The team used a rain machine to simulate wet weather, and mounted the sensor behind a piece of windshield glass to gauge long-range sensor performance in heavy rain.

“The ultra-long-range capabilities of our adaptive LiDAR enable OEMs to release new revenue-generating applications like highway autopilot or hub-to-hub autonomous trucking,” said Jordan Greene, GM of ADAS and VP of Corporate Development at AEye. “Being able to deliver this performance in all weather conditions ensures these applications can be safely implemented in even the toughest driving environments.”

“Having already verified AEye’s extreme long-range detection, this was an important follow-up test to ensure that 1000 meter performance would stand up in less than ideal weather conditions, and when mounted behind the glass of a windshield,” said Phil Magney, founder and president at VSI Labs. “We were impressed with the sensor’s performance on both counts – which certainly bodes well for OEMs looking to implement reliable, high-performance LiDAR.”

AEye Detects Pedestrians, Through the Rain, in the Dark
In a second test of its LiDAR at the ACM track, the AEye sensor mounted on VSI’s test vehicle detected small objects in a tunnel, through rain and a second surface, at 120 meters. This test was conducted amid heavy rain, with the sensor peering into a dark tunnel. The AEye sensor detected five bricks and a black dog not visible to the human eye at 120 meters, as well as a pedestrian and child at 110 meters.

“I’ve never seen a demo like that one before – in a real-world scenario under poor weather, behind the windshield, while still being able to achieve the distance and detection. What we saw was really impressive,” said Sam Abuelsamid, principal research analyst at Guidehouse Insights.

In March, AEye announced VSI Labs verified AEye LiDAR’s breakthrough range, resolution and speed capabilities, as well as its ability to place the sensor behind first surfaces, such as the windshield or grille, with minimal performance impact. The latter is critical to automotive OEMs, as it provides flexibility in implementing sensors within their designs, without compromising aesthetics or changing the aerodynamics of the vehicle.

This design-centric vehicle integration is made possible by AEye’s unique bistatic architecture, which separates the transmit and receive paths, providing optical isolation that – unlike traditional coaxial LiDAR systems – ensures any light reflected back doesn’t blind the sensor. The architecture also ensures optimal performance, even in the most adverse weather conditions. This performance is further enhanced by AEye’s use of 1550 nanometer lasers, whose longer wavelength better penetrates obscurants, providing superior detection in rain, snow, and smoke.

AEye’s intelligent LiDAR uses adaptive sensing to deliver this industry-leading performance, which addresses the most difficult challenges facing autonomous driving, while meeting automotive functional safety requirements. Unlike traditional sensing systems, which passively collect data, AEye’s adaptive LiDAR scans the entire scene, while intelligently focusing on what matters in order to enable safer, smarter, and faster decisions in complex scenarios. As a result, AEye’s LiDAR uniquely enables higher levels of autonomous functionality (SAE L2-L5) at the optimal performance, power, and price.

AEye is the first and only LiDAR provider to have its performance independently verified and published by reputable third-party testing organizations. In addition to performance testing by VSI Labs, the ruggedness and reliability of the sensors have been validated by global product test, inspection, and certification leader NTS, which put the sensors through extreme automotive shock and vibration tests. More information on the tests by VSI Labs and NTS is publicly available on AEye’s website.

AEye Accelerated

AEye, Inc. (“AEye”), the global leader in adaptive, high-performance LiDAR solutions, today announced it has accelerated the rollout of its business model across automotive, industrial and mobility markets. In automotive, Continental announced it has integrated AEye’s long-range LiDAR technology into its full stack Automated Driving platform, and is industrializing the technology for a planned start of volume production in 2024, while AEye has selected Sanmina to begin production of AEye’s 4Sight M LiDAR sensor for industrial and mobility markets this September.

AEye is the only LiDAR company to combine a licensing model with a single platform and supply chain for all markets.


“TuSimple has the world’s most advanced autonomous driving system, with the industry’s best long-range perception,” said Chuck Price, Chief Product Officer at TuSimple. “AEye’s adaptive LiDAR complements our solution, with its ultra-long range, high performance enabling object acquisition and avoidance capabilities at highway speeds that are imperative for safe autonomous trucking implementations. AEye’s software-configurable hardware enables us to utilize a single sensor for both low speed, wide Field-of-View cut-ins and high speed, long-range, small object detection – flexibility that is incredibly powerful for addressing the wide scope of trucking corner cases.”

AEye LiDAR enables the highest levels of safety for all vehicles – including heavy-duty trucks. Its 1000 meter range, as well as its resolution and speed, have been independently verified as industry-leading by automated vehicle technology evaluator, VSI Labs. AEye’s LiDAR delivers more than twice the range and over two times the resolution of any long-range LiDAR. Its automotive-grade reliability, solid-state MEMS scanner performance, and its unique ability, via the system’s 1550nm adaptive LiDAR, to effectively scan through rain and other obscurants make it ideal for trucking. Additionally, it is the only software-configurable LiDAR, enabling customers to use the same LiDAR hardware optimized for each application, ranging from complex merging to hub-to-hub highway automation.

“We are pleased to partner with TuSimple in advancing reliable highway autonomy for long-haul trucks,” said Blair LaCorte, CEO at AEye. “AEye’s high performance LiDAR system enables detection of road debris, pedestrians, vehicles, and more at extremely long ranges, with software-configurable control that enables TuSimple’s full stack to navigate complex environments at speed. We look forward to a rich partnership with TuSimple, deploying the next generation of autonomous solutions for the trucking industry.”

AEye’s iDAR™ is a proprietary, intelligent, low-cost LiDAR that uses adaptive sensing to deliver industry-leading performance and address the most difficult challenges facing autonomous driving while meeting automotive functional safety requirements. Traditional sensing systems passively collect data. AEye’s adaptive LiDAR scans the entire scene, while intelligently focusing on what matters in order to enable safer, smarter, and faster decisions in complex scenarios. As a result, AEye’s LiDAR uniquely enables higher levels of autonomous functionality (SAE L2-L5) at the optimal performance, power, and price.

AEye and NVIDIA

AEye, Inc. (“AEye”), the global leader in adaptive, high-performance LiDAR solutions, announced it is working with NVIDIA to bring its adaptive, intelligent sensing to the NVIDIA DRIVE® autonomous vehicle platform.

The NVIDIA DRIVE platform is an open, end-to-end solution for Level 2+ automated driving to Level 5 fully autonomous driving. With AEye’s intelligent, adaptive LiDAR supported on the NVIDIA DRIVE platform, autonomous vehicle developers will have access to next-generation tools to increase the saliency and quality of data collected as they build and deploy state-of-the-art ADAS and AV applications. Specifically, AEye’s SDK and Visualizer will allow developers to configure the sensor and view point clouds on the platform.

AEye and Sanmina

AEye, Inc. (“AEye”), the global leader in adaptive, high-performance LiDAR solutions, today announced Sanmina Corporation, a leading integrated manufacturing solutions company that manufactures some of the world’s most complex and innovative electronic, optical and mechanical products, will begin production of AEye’s 4Sight M LiDAR sensor for industrial and mobility markets in September. The transfer from AEye’s pilot line in Dublin, California to Sanmina’s commercial production lines will take place over the next few months, as the company prepares for volume production.

ON Semi for AutoX Gen5

ON Semiconductor (Nasdaq: ON) today announced that its image sensing and LiDAR technologies power key functions of AutoX’s Gen5 self-driving platform. Revealed at the World Artificial Intelligence Conference, the new Gen5 autonomous vehicle technology enables the first fully driverless RoboTaxi, designed to democratize autonomy and provide universal access to the transportation of people and goods.

“In our quest to bring our Level 4 autonomous RoboTaxi to the market, ON Semiconductor is the obvious partner for all of our sensing needs,” stated Jianxiong Xiao, founder and CEO of AutoX. “The AR0820AT 8 MP image sensor enables high-resolution camera fusion with other sensors. This is crucial in dense urban scenarios, where a wide field of view is needed to capture objects on sidewalks or cross traffic, while extending the practical sensing distance to beyond 300 meters to enable autonomy at freeway speeds, where objects or signs must be recognized farther away from the vehicle to enable sufficient reaction time.”

AutoX RoboTaxis are equipped with the most advanced camera sensors and LiDAR detectors for the highest level of safety. ON Semiconductor provides 28 high-resolution AR0820AT 8 MP image sensors and four SiPM arrays for LiDAR sensors, providing a full surround view with zero blind spots.

“ON Semiconductor continues to drive innovations in market-leading sensor technologies, and our scalable sensor solutions address the stringent and rapidly evolving needs of the automotive market,” commented Ross Jatou, senior vice president, Intelligent Sensing Group at ON Semiconductor. “Performance and deep integration are both key for truly driverless applications. We are thrilled with the continued engagement with AutoX, as we continue to advance active safety and enable fully autonomous driving.”

AutoX’s complete hardware and software stack for Level 4 autonomous driving can handle the densest and most dynamic traffic conditions, demonstrated by its recognition as the first company to receive a license to operate fully autonomous RoboTaxis in China. AutoX has deployed hundreds of RoboTaxis in Shanghai, Shenzhen, Wuhan and other major Chinese cities. The company also launched its RoboTaxi and RoboDelivery pilot services in California last year.

ON Semiconductor is a leader in automotive sensing with over 400 million image sensors on the road today. Leveraging a legacy of imaging excellence spanning over forty-five years, the company supplies a variety of sensor types, resolutions and optical formats for the most demanding imaging applications. The company’s sensor portfolio includes advanced solutions for park assist, surround/rear view cameras, in-cabin, mirror replacement, lane departure warning, advanced braking, collision avoidance and other ADAS/AD systems.

Continental Integrates AEye LiDAR

On the path toward autonomous driving, system integration capabilities are critical. Only ten months after partnering with LiDAR expert AEye, Continental is integrating the long-range LiDAR technology into its full sensor stack solution to create the first full stack automotive-grade system for Level 2+ up to Level 4 automated and autonomous driving applications. The solution, based on AEye’s LiDAR technology, is a substantial part of the sensor setup for high level automation systems. It complements the radar, camera and ultrasonic technologies in Continental’s sensor system, and enables a reliable and redundant Automated Driving platform that can handle complex, diverse traffic scenarios and adverse weather conditions.

Complementary technology approach enables high level automation on the road

“Reliable and safe automated and autonomous driving functions will not be feasible without bringing the strengths of all sensor technologies together,” said Frank Petznick, head of Continental’s ADAS Business Unit. Complex and safety critical traffic scenarios, such as obstacles on the road and fast vehicles passing on highways, require high automation systems to have a maximum sensing range and image resolution to ensure sufficient response time. A single technology approach cannot fulfill this requirement. Continental’s complementary approach also ensures that the system can operate in all environmental conditions, including low sun, heavy rain, dense fog and cold or hot temperatures.

From start-up technology to affordable solutions for mass market applications

One of the most challenging parts of developing new technologies is to make them accessible to the mass market. Continental is now industrializing AEye’s reference technology for mass market production, which is critical to ensuring consistent quality that vehicle manufacturers can rely on.

“Our partnership with AEye is unique because it enables Continental to build a new long-range LiDAR in a very short time, based on AEye’s reference architecture and software,” said Dr. Gunnar Juergens, Head of LiDAR Segment at Continental. “We will manage the entire product life cycle, including the development of a mass market product, as well as manufacturing, validation and testing according to automotive-grade standards.”

The industrialization of new technologies is not new to Continental. The technology company has a long track record of making new cutting-edge technologies accessible for vehicle manufacturers around the globe. In 1999, Continental industrialized radar technology and was able to realize an affordable price point that helped bring important safety functions like emergency brake assist to market. Later this year, based on more than 20 years of experience in Automotive LiDAR, Continental is launching the first solid-state short-range High Resolution 3D Flash LiDAR to the market.

“With AEye’s adaptive LiDAR technology, OEMs can optimize the field of view and resolution of the sensor for different use cases with software,” Petznick said. “This is a natural fit with our configurable full stack platform that seamlessly integrates cameras, radars, and LiDARs with our Automated Driving Control Units and software solutions to give light and commercial vehicle manufacturers the ability to easily select and combine the ADAS features they want within a wide breadth of vehicle models.”

Sample production and ramp-up of series production

As a result of the partnership, Continental is already producing first samples of the new long-range LiDAR in its Ingolstadt plant in Germany. With these samples, Continental can drive further industrialization with a planned start of production in 2024. In parallel, Continental has started production line build-up preparations to ensure a smooth and timely transition from sample to series production.

Continental develops key components for assisted and automated driving all over the world. Last month, the company celebrated the start of construction on a new, state-of-the-art manufacturing facility in New Braunfels, Texas. As the demand for intelligent safety functions continues to increase, Continental is expanding its R&D and manufacturing capabilities to meet the growing demands.

Continental develops pioneering technologies and services for sustainable and connected mobility of people and their goods. Founded in 1871, the technology company offers safe, efficient, intelligent, and affordable solutions for vehicles, machines, traffic and transportation. Continental generated preliminary sales of €37.7 billion in 2020 and currently employs more than 235,000 people in 58 countries and markets. In 2021, the company celebrates its 150th anniversary.

AEye and TuSimple

AEye, Inc. (“AEye”), the global leader in adaptive, high-performance LiDAR solutions, today announced a development partnership with global self-driving technology company, TuSimple. TuSimple is working with VW’s TRATON Group to develop a commercial-ready fully autonomous system for heavy-duty trucks, and is co-developing Level 4 self-driving trucks with Navistar, targeting production in 2024. TuSimple selected AEye as a development partner based on AEye LiDAR’s extreme long-range performance, impressive weather capabilities, and its ability to address the most challenging autonomous trucking situations.

The partially and fully autonomous truck market is expected to reach approximately $88 billion by 2027, growing at a compound annual growth rate of greater than 10% between 2020 and 2027, according to data from Acumen Research and Consulting.

Woven Planet Acquires CARMERA

Woven Planet Holdings, Inc. (“Woven Planet”), a subsidiary of Toyota Motor Corporation, announced the acquisition of CARMERA, Inc. (“CARMERA”), a U.S.-based spatial AI company, which specializes in bringing next-generation road intelligence to automated mobility at scale. This is the second major deal for Woven Planet in North America, following the April 2021 announcement to acquire Level 5, the self-driving division of Lyft.

Once the deal is closed, the CARMERA team will report into the Automated Mapping Platform (“AMP”) organization of Woven Alpha, Inc. (“Woven Alpha”). Woven Alpha focuses on exploring new strategic areas for business expansion and incubates several innovative projects such as Woven City and Arene, which is Woven Planet’s open software platform. AMP is a connected crowdsourced software platform that supports the creation, development and distribution of high definition (“HD”) maps—a key enabler for smart and safe automated mobility.

The Woven Alpha team plans to develop AMP to become the most globally comprehensive road and lane network HD map platform, enabling high-precision localization support to automated vehicles. The acquisition of CARMERA will accelerate AMP’s shift from the R&D stage to the next phase of commercialization by bolstering the platform’s engineering team with top experts in the development of HD maps. In addition, it will provide access to CARMERA’s sophisticated map update, change management and IoT sensing technology.

Together, the teams will tap into CARMERA’s ability to successfully update HD maps from crowdsourced, camera-based inputs—a significantly cheaper and faster approach than traditional methods. This will strengthen AMP’s ability to serve a comprehensive set of road classes and features, reflecting changes in lane markings, traffic signals, signs and more in near real-time, and support its future multi-regional commercial launch.

CARMERA will join Woven Planet Group as a wholly-owned subsidiary, expanding the company’s footprint beyond its Tokyo headquarters by adding New York and Seattle offices to its planned offices in Silicon Valley and London.

Velodyne Joins NVIDIA Metropolis

Velodyne Lidar, Inc. (Nasdaq: VLDR, VLDRW) announced it has joined the NVIDIA Metropolis program for Velodyne’s Intelligent Infrastructure Solution for traffic monitoring and analytics. NVIDIA Metropolis is designed to nurture and bring to market a new generation of applications and solutions that make the world’s most important spaces and operations safer and more efficient with advancements in AI vision.

The Intelligent Infrastructure Solution combines Velodyne’s award-winning lidar sensors and powerful AI software to monitor traffic networks and public spaces. The solution addresses the pressing need for smart city systems that can help improve road safety and prevent traffic accidents. Government data showed 2020 was the deadliest year for U.S. traffic crashes in over a decade, with a 7.2 percent increase in traffic fatalities over the previous year.

Velodyne’s solution leverages the powerful capabilities of the embedded NVIDIA Jetson AGX Xavier module in its edge AI computing system to run the solution’s proprietary 3D perception software, which can detect all road users including vehicles, pedestrians and cyclists in real time. The NVIDIA Jetson AGX Xavier provides the computing power of a GPU workstation in a compact, energy-efficient module. It powers the Intelligent Infrastructure Solution AI application to run at up to 50 frames per second and process lidar frames in real time to detect, classify and extract traffic trajectories.

Being a part of NVIDIA Metropolis, Velodyne gains increased exposure to industry experts, AI-driven organizations, governments and integration partners looking to leverage world-class AI-enabled solutions to improve critical operational efficiency and safety problems. Velodyne also has early access to NVIDIA platform updates and can tap into NVIDIA’s diverse partner ecosystem to support lidar-based solution development.

New Deployment in Austin

Velodyne’s Intelligent Infrastructure Solution pilot project is set to deploy in Austin, Texas. The city will use the solution to assess traffic conditions and identify proactive safety measures that can be taken to help save lives.

The project is using lidar-based traffic monitoring as a reliable, non-intrusive multi-modal replacement for inductive-loop detectors, cameras, and radars. Velodyne’s Intelligent Infrastructure Solution creates a real-time 3D map of roads and intersections to generate traffic data in a wide variety of lighting and weather conditions, and provide categorized monitoring data on pedestrians, cyclists, cars, and trucks.

In Austin, the first installation will be at East 7th Street & Springdale Road. This intersection has been identified as needing improvement due to its accident history, fatality risk, speeding prevalence and congestion.

“Lidar-based solutions hold tremendous potential to protect personal identifiable information while continuing to help us to achieve our Vision Zero goal to eliminate traffic deaths and serious injuries on Austin streets,” said Jason JonMichael, Assistant Director, Austin Transportation Department. “Velodyne Lidar’s Intelligent Infrastructure Solution could enable us to more efficiently collect and analyze the mobility data needed to improve roadway efficiency and safety.”

“It is great to be working with the city of Austin on innovation initiatives that will certainly transform roadways into smarter, safer infrastructure for the community,” said Jon Barad, Vice President of Business Development, Velodyne Lidar. “Partnering with NVIDIA Metropolis allows us to help Austin and other communities address road safety challenges and improve traffic efficiency and sustainability.”

Aurora SPAC

Aurora, the self-driving technology company, has entered into a definitive business combination agreement with Reinvent Technology Partners Y (“Reinvent”) (NASDAQ: RTPY), a special purpose acquisition company with the sponsor team that takes a “venture capital at scale” approach to investing. Upon closing of the proposed transaction, the combined company will be named Aurora Innovation, Inc. and be publicly traded, with its common stock expected to be listed on Nasdaq with the ticker symbol AUR.

Investors and Aurora partners have committed $1 billion in a PIPE and the proposed transaction represents an equity value of $11 billion for Aurora. Investors in the PIPE include Baillie Gifford, funds and accounts managed by Counterpoint Global (Morgan Stanley), funds and accounts advised by T. Rowe Price Associates, Inc., PRIMECAP Management Company, Reinvent Capital, XN, Fidelity Management and Research LLC, Canada Pension Plan Investment Board, Index Ventures, and Sequoia Capital, as well as strategic investments from Uber, PACCAR, and Volvo Group.

Aurora’s truck manufacturing partners, Volvo Group (which includes Volvo Autonomous Solutions) and PACCAR (which includes the Peterbilt and Kenworth brands) collectively represent approximately 50 percent of the Class 8 trucks sold in the U.S. market. As long-term committed partners, Volvo and PACCAR will help accelerate the development, validation, and deployment of self-driving trucks. Aurora is also expected to scale rapidly in passenger mobility with the support of Toyota, the world’s #1 automotive OEM, and Uber, the largest ride-hailing network globally by market cap.

Existing Aurora stockholders are expected to own approximately 84 percent of the pro forma combined company following the close of the proposed transaction.

Additional information about the proposed transaction, including a copy of the merger agreement and investor presentation, has been provided in a Current Report on Form 8-K filed by Reinvent today with the Securities and Exchange Commission (SEC) and available at www.sec.gov. In addition, Reinvent intends to file a registration statement on Form S-4 with the SEC, which will include a proxy statement/prospectus, and will file other documents regarding the proposed transaction with the SEC.

Radar-Transparent Film from AkzoNobel

The radar-transparent bright film – perfected by working together with customers – means vehicle makers no longer have to worry about hiding sensors behind solid metal, which can block the signals of safety features such as anti-collision warnings.

AkzoNobel is an approved supplier of film products for this particular application, and the company was recently specified as a global supplier of emblems by one of the world’s biggest car brands.

“Vehicle requirements are changing all the time and we’re very happy to have solved this difficult problem with an intelligent film coating that allows radar signals to pass through,” explains Patrick Bourguignon, Director of AkzoNobel’s Automotive and Specialty Coatings business.

“It was a highly technical process, which involved close collaboration with customers to establish correlations between film properties and radar transmission,” he continues. “On the surface, it provides an attractive, mirror-like finish. What you can’t see is the breakthrough technology we’ve developed, which allows better and more consistent transmission of signals that ultimately help people to drive more safely.”

The products supplied by the company’s Film business are mainly used by the automotive and aerospace industries, as well as for signage. They include coated films for substrate protection and decoration, and markings/decals used for safety and decoration.

“Innovation not only drives our business, it also contributes to the success of our customers,” says Bourguignon, who adds that other vehicle manufacturers are keen to include the bright film technology in their designs. “We’ll continue to channel our pioneering spirit into working together with our partners to develop advanced solutions and meet any challenges that the future may bring.”