At CES, automakers demonstrated concepts and technology for their cars of the future, each taking a different approach to automotive design and autonomy. The concept and autonomous cars cover personalization, health monitoring, home connections, new mobility uses, biometrics, en-route package delivery, scent, streaming entertainment, supplying energy to homes and calls for help from a human in a control center.
The Las Vegas Convention Center and its parking lots were overrun with swarms of people viewing and trying out the latest technology in the automotive industry. Here are the top new innovations from automakers at CES, with videos and descriptions.
The automakers differ in how they are going about autonomous driving. Nissan and Toyota offer a graduated approach: Toyota will have a Guardian system, while Nissan will offer the ability to call a command center for help navigating difficult situations. The concept cars focus on personalization, uniquely recognizing the driver through cameras or other biometrics.
Services offered include music, shopping, package delivery, aroma therapy, personal assistance, power storage, convertible spaces and exterior messaging.
Toyota
Concept-I
Toyota takes a friendly approach to its car of the future, based on how people warmly relate to themselves and the world around them. It lacks the massive screens of other concept vehicles, offering a simpler concept.
Designed by Toyota’s CALTY Design Research, with user experience technology development from the Toyota Innovation Hub in San Francisco, the Concept-i is based on “kinetic warmth,” warm technology that is welcoming and fun. Concept-i leverages the power of an advanced artificial intelligence (AI) system to anticipate people’s needs, inspire their imaginations and improve their lives.
Its AI learns with the driver to build a relationship that is meaningful and human, measuring emotions to improve quality of life.
Automated vehicle technologies help enhance driving safety, combined with visual and haptic stimuli to augment communication based on driver responsiveness. While under certain conditions users will have the choice of automated or manual driving based on their personal preference, Concept-i monitors driver attention and road conditions, with the goal of increasing automated driving support as necessary to buttress driver engagement or to help navigate dangerous driving conditions.
In fact, Concept-i avoids screens on the central console, revealing information when and where it’s needed. Colored lights in the foot wells indicate whether the vehicle is in automated or manual drive; discrete projectors in the rear deck project views onto the seat pillar to help warn about blind spots; and a next-generation head-up display helps keep the driver’s eyes and attention on the road.
Toyota is nominated for a Tech CARS Award; we’d appreciate your vote.
Messages from Yui appear on the exterior door panels to greet driver and passengers as they approach the vehicle. The rear of the vehicle shows messages to communicate about upcoming turns or warn about a potential hazard. The front of the vehicle communicates whether the Concept-i is in automated or manual drive.
Autonomous Plans: Guardian & Chauffeur
In the case of self-driving, Toyota Research Institute’s Dr. Gill Pratt notes that there are problems with partial autonomous driving: the driver often loses interest and either trusts the car too much or too little. Toyota is working on the Guardian system and a Level 2-5 system called Chauffeur. Guardian engages only when needed; Chauffeur is engaged all the time.
To show that it is forward-thinking, Toyota announced that it will be discussing the technology on Reddit.
Hyundai
Concepts
‘Health + Mobility Cockpit’ is a concept enabled by the monitoring of health and related indicators to help manage the stress and other negative effects of driving. Sensors throughout the car could monitor the physical and mental state of the driver, detecting everything from the driver’s posture to their respiratory rate and breathing depth. The car could also measure heart-rate variability for stress response, and use eye tracking and facial feature recognition to track alertness and emotional state. To promote mental awareness and focus, the car could then respond by delivering a customized multi-sensory experience, as sketched after the list below.
- Posture – In response to data that suggests the driver is losing focus or alertness, the driver’s seat can automatically adjust to a more upright position. Alternatively, if sensors detect driver discomfort or agitation, active pneumatic lumbar systems can massage the driver’s lower back to promote relaxation.
- Scent – Combining a range of different scents can elicit a variety of driver responses as deemed necessary by the health and wellness monitoring systems.
- Light – In the same way as dawn stimulates the senses to wake the body, varying levels of warm and cool lighting can spread across the dashboard to impact alertness and mood.
- Temperature – The Healthcare Cockpit can sense the ambient temperature of the car and direct cooler or warmer air towards the driver to alter responsiveness or enhance comfort.
- Sound – The car’s music and radio applications are able to sync with the Healthcare Cockpit’s sensors to create relaxed or dynamic environments.
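To make the idea concrete, here is a minimal sketch, in Python, of how monitored driver state might be mapped to the multi-sensory responses listed above. The sensor fields, thresholds and actions are illustrative assumptions, not Hyundai’s actual design.

```python
# Hypothetical sketch of a Health + Mobility Cockpit response loop.
# Field names, thresholds and actions are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class DriverState:
    alertness: float   # 0.0 (drowsy) .. 1.0 (fully alert), e.g. from eye tracking
    stress: float      # 0.0 (calm) .. 1.0 (stressed), e.g. from heart-rate variability
    slouching: bool    # from posture sensors in the seat

def choose_responses(state: DriverState) -> list[str]:
    """Map the monitored driver state to multi-sensory cabin responses."""
    responses = []
    if state.slouching or state.alertness < 0.4:
        responses.append("seat: move to a more upright position")
    if state.alertness < 0.4:
        responses.append("light: shift dashboard lighting toward cool, bright tones")
        responses.append("temperature: direct cooler air toward the driver")
        responses.append("sound: switch to a more dynamic playlist")
    if state.stress > 0.7:
        responses.append("seat: run pneumatic lumbar massage")
        responses.append("scent: release a calming aroma")
        responses.append("light: dim to warm tones")
    return responses

# Example: a drowsy but calm driver
print(choose_responses(DriverState(alertness=0.3, stress=0.2, slouching=True)))
```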
The ‘Mobility Vision’ concept lets the car drive itself, connect to homes, power homes and act as a mini mobile home or home accessory. The car integrates itself with the living space: it can be an extra couch or entertainment area, or even work as an HVAC unit for an existing house. When ‘docked’ with the Smart Home, Hyundai Motor’s mobility concept becomes an integral part of the living space, performing useful functions and enhancing the living environment. For example, the mobility concept can act as an air conditioner; share its entertainment facilities by mirroring audio and visual outputs with the home’s smart devices; and even provide power in emergency situations, using its on-board fuel cell as a generator.
Autonomous Driving
AUTO Connected Car News went for a short ride in the Hyundai Ioniq autonomous car. It drives similarly to other autonomous driving cars.
The Autonomous IONIQ uses LiDAR to navigate. The LiDAR is hidden behind the autonomous IONIQ’s front bumper. The car’s advanced self-driving systems are kept as simple as possible by integrating existing functions from the production model, including the Smart Cruise Control system’s forward-facing radar and Lane Keeping Assist cameras.
The system also uses a GPS antenna and Blind Spot Detection radar. Hyundai Motor is continuing to develop and refine its self-driving technologies to use less computing power, creating a lower-cost system that consumers can afford.
The Hyundai Ioniq is nominated for AUTO Connected Car News’ Tech CARS Awards; you are welcome to vote for your favorite.
Chrysler
Chrysler Portal Concept
The Chrysler Portal concept is designed for Millennials by younger car designers to meet the future needs of younger car buyers.
Millennials are tech savvy, environmentally aware and cost conscious. The Chrysler Portal concept is designed for flexible space, social media communities and seamless mobile device integration, as well as for transporting children.
The vehicle can be upgraded as its owners’ lives change. The open-space cabin can be reconfigured, and battery-electric vehicle technology contributes to cabin spaciousness. The lighting on the Chrysler Portal concept is an interactive experience as well as a communication tool. The vehicle is equipped with full-color, changing LED lighting on the front, side portals and rear. Not only can the light take on different colors, it can have a swiping or animated appearance. Interactive ground projection and portal lighting are available in infinite colors that can be tailored for personal, business or drive settings, such as when the vehicle is parked, locked/unlocked or in autonomous mode.
The headlamps and tail lamps feature next-generation Thin Lens LED technology with an adaptive driving beam to provide increased safety through improved visibility. A full-length clear polycarbonate roof panel visually expands the vehicle’s interior space and admits natural light to all occupants.
The Chrysler Pacifica has many new features and is nominated for a Tech CARS Award; vote for your favorite.
Technology is a major part of the concept. Infotainment, sensor and software systems are designed to provide a seamless user experience, including these key features:
- Facial recognition and voice biometrics recognize the user and can customize individual or family settings to provide a unique drive experience based on preferred features, such as exterior and interior lighting, favorite music, enhanced audio settings, favored destinations and more (a minimal sketch of this idea follows the list below)
- Vehicle-to-X (V2X) communication enables the vehicle and infrastructure to “talk” to each other, such as intersection crash warning, traffic sign recognition and emergency vehicle approaching
- Personal Zoned Audio keeps the driver aware of surroundings by enhancing sound and directionality in the event of approaching emergency vehicles
- Seamless vehicle integration of personal devices, such as phones, tablets, cameras and wearables
- Community sharing enables passengers to share music, images, videos and more with other passengers.
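As a rough illustration of the profile-loading idea in the first bullet above, here is a minimal sketch assuming a simple per-user settings lookup keyed by a biometric match. The profile fields and function names are hypothetical, not part of the Portal concept’s actual software.

```python
# Hypothetical sketch of biometric profile loading. All names and fields are
# illustrative assumptions, not FCA's design.
from typing import Optional

PROFILES = {
    "driver_a": {
        "interior_lighting": "teal",
        "favorite_station": "Alt Nation",
        "audio_profile": "front-focused",
        "destinations": ["home", "office", "school"],
    },
}

def identify_user(face_embedding, voice_print) -> Optional[str]:
    """Stand-in for the facial-recognition / voice-biometrics match step."""
    # A real system would compare embeddings against enrolled users;
    # here we simply pretend the match succeeded.
    return "driver_a"

def apply_profile(user_id: Optional[str]) -> None:
    """Apply the matched user's saved cabin settings, or fall back to guest mode."""
    profile = PROFILES.get(user_id) if user_id else None
    if profile is None:
        print("Applying guest settings")
        return
    print(f"Setting cabin lighting to {profile['interior_lighting']}")
    print(f"Tuning to {profile['favorite_station']}")
    print(f"Applying audio profile: {profile['audio_profile']}")

apply_profile(identify_user(face_embedding=None, voice_print=None))
```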
Volkswagen
Cockpit
Volkswagen showed its Digital Cockpit, a 3D technology in which two screens, set one behind the other, create a 3D feel with an amazing impression of depth. The multi-dimensional presentation, combined with excellent image quality, makes learning to use the display quicker and easier than ever before. Another new technology is Eyetracking, which recognizes where the driver is looking inside the vehicle. The technology can reduce the amount of information shown when the display is not being viewed; graphical animations are intentionally shown only when the driver looks at the screen (a minimal sketch of this logic follows the list below). At the same time, users reach the control they want faster, as there are no longer any intermediate steps in the menu and fewer controls to be accessed through the steering wheel. The AR Head-up Display projects information graphics in virtual form ahead of the vehicle. This technology is less tiring on the driver’s eyes than traditional vehicle displays. In terms of function and feel, Volkswagen goes far beyond what other manufacturers have presented in this realm, showing information on two levels:
- Level 1: Data that is relevant to the route or distance to the vehicle ahead appears on the road, several meters ahead of the vehicle. As a result of the natural positioning on the road itself, the display fits seamlessly into the surroundings. The driver now understands information more easily than ever before, significantly reducing driver distraction.
- Level 2: All other data, such as the infotainment, are presented by an AR Head-up Display closer to the windshield. Here, drivers are able to access any personally relevant information without having to take their eyes off the road.
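The gaze-dependent behavior of the Eyetracking demo can be pictured as a simple gate on how much detail is drawn. The sketch below is an assumption-laden illustration; the gaze zones and content levels are invented for the example, not Volkswagen’s implementation.

```python
# Hypothetical sketch of gaze-dependent display detail, in the spirit of
# Volkswagen's Eyetracking demo. Zones and fields are illustrative assumptions.
def display_content(gaze_zone: str, full_info: dict) -> dict:
    """Return only the information appropriate for where the driver is looking."""
    if gaze_zone == "center_display":
        # Driver is looking at the screen: show full detail and animations.
        return {**full_info, "animations": True}
    # Driver is looking elsewhere (road, mirrors): strip secondary detail.
    return {"next_turn": full_info["next_turn"], "animations": False}

info = {"next_turn": "Left in 300 m", "media": "Track 4 of 12", "weather": "22°C"}
print(display_content("road", info))            # reduced information, no animations
print(display_content("center_display", info))  # full detail with animations
```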
Volkswagen I.D. Concept
The car’s central infotainment display and multifunction steering wheel builds on the control concepts of the Golf R Touch (CES 2015) and BUDD-e (CES 2016). The Interactive Experience system’s central infotainment display features multi-finger recognition, allowing users to adjust the air conditioning system or audio volume via a menu. Tactile feedback and smart illumination help users find their way around without having to take their eyes off the road. The same principles are used on the multi-function steering wheel’s control pods, improving ease of use.
The visionary I.D. combines the digitally connected world with an electric-powered car that, when desired, can drive in full autonomous mode. The concept car made its world premiere last year at the Paris Motor Show. The production version of the I.D. is planned to launch as early as 2020, with plans for fully autonomous driving by 2025. The car’s autonomous mode, named I.D. Pilot, is activated by touching the VW logo on the steering wheel, signaling it to disappear into the instrument panel and making more usable space for the driver-turned-passenger. Drivers will open and turn on the I.D. using their smartphone as a Digital Key. Their personal seat and air-conditioning settings, favorite radio stations and media playlists, sound system settings, contact details of friends and business partners, and the configuration of their navigation systems will all be uploaded through their Volkswagen User-ID. Also on board will be advanced technologies such as the AR Head-up Display and Eyetracking. The I.D. combines the brand’s themes of Connected Community, Intuitive Usability and Smart Sustainability, adding fully autonomous driving to it all.
Ford Autonomous at Home
Ford showed some cars on the sidewalks outside the Las Vegas Convention Center and held news conferences. The company is working on autonomous vehicles and more connections for its services.
Ford announced that it is tripling its fleet of Fusion Hybrid autonomous research vehicles this year – making the company’s fully autonomous vehicle fleet the largest of all automakers – and accelerating the development and testing of its virtual driver software in both urban and suburban environments. Fusion Hybrid sedans were chosen for the second-generation vehicles because they have the newest and most advanced electrical architecture. With the latest generation of computers and sensors – including the smaller, but more advanced Velodyne LiDAR HDL-32E sensor – Ford’s autonomous vehicle platform moved a step closer to production.
The new fleet vehicles will use Velodyne’s advanced new Solid-State Hybrid Ultra PUCK Auto sensor, providing precision required for mapping and creating accurate, real-time 3D models of the surrounding environment, enhancing Ford’s software development and testing to handle a broader range of driving scenarios.
Ford autonomous vehicles are part of Ford Smart Mobility, the plan to take the company to the next level in connectivity, mobility, autonomous vehicles, the customer experience, and data and analytics.
Ford Smart Home
Ford is exploring linking smart devices like Amazon Echo and Wink to its vehicles to allow consumers to control lights, thermostats, security systems and other features of their homes from their car, and to stop, start, lock, unlock and check their vehicle’s fuel range from the comforts of their couch.
Ford is looking to use new SYNC Connect technology to link vehicles with the Amazon cloud-based voice service.
Ford is working to link the home automation devices with its vehicles through industry-leading Ford SYNC. This comes as half of consumers say they will buy at least one smart home product in the next year, according to Icontrol Networks.
BMW
BMW Group, Intel and Mobileye announced that a fleet of approximately 40 autonomous BMW vehicles will be on the roads by the second half of 2017. The BMW 7 Series will employ cutting-edge Intel and Mobileye technologies during global trials starting in the U.S. and Europe.
BMW 5 Series Sedan demoed automated drive, where drivers no longer need to operate the accelerator or brake pedal and can also take their hands off the steering wheel, allowing them to concentrate on other activities instead. An extra onboard computer continuously cross-checks the vehicle’s position and data about its surroundings against a highly detailed digital roadmap, resulting in very accurate lane-keeping. The drive concluded with the Robot Valet Parking service – a fully automated parking procedure.
The BMW i Inside Future sculpture at CES showcased BMW HoloActive Touch. BMW Connected demos featured an in-car voice-controlled personal digital assistant using Microsoft’s Cortana.
Amazon Prime Now, which is integrated into all the user’s devices (both in and outside the vehicle) via the Open Mobility Cloud, enables goods to be ordered through the app while drivers are on the way to their next destination. The En-Route Delivery service then delivers the package to the vehicle.
With the Air Touch system, featured at CES 2016 in the BMW i Vision Future Interaction concept car, BMW presented a panoramic display that can be operated just like a touchscreen – except that there is no actual contact involved. Now this system has been taken a stage further with BMW HoloActive Touch. BMW HoloActive Touch fuses the advantages of the BMW Head-Up Display, BMW gesture control and intuitive touchscreen functionality. This innovative interface between the driver and vehicle consists of a free-floating virtual display which is projected in the area above the center console. The system is operated directly by finger movements, while an ultrasound source provides tactile confirmation of the driver’s commands.
The BMW Connected Window illustrates the possibilities these developments offer for a personalized and intelligent enhancement of digital lifestyles. The BMW Connected Window integrates every type of information relevant for daily mobility planning. Using the Open Mobility Cloud, this virtual window offers digital functions to support personal daily planning and numerous other aspects of individual lifestyles.
BMW is nominated for a Tech CARS Award; please vote for your favorites.
Nissan
The next Nissan LEAF will have a semi-autonomous mode using ProPILOT technology, enabling autonomous drive functionality for single-lane highway driving.
Nissan announced the Seamless Autonomous Mobility (SAM) system. Developed from NASA technology, SAM partners in-vehicle artificial intelligence (AI) with human support to help autonomous vehicles make decisions in unpredictable situations and build the knowledge of in-vehicle AI.
With SAM, the autonomous vehicle becomes smart enough to know when it should not attempt to negotiate the problem by itself. Instead, it brings itself to a safe stop and requests help from the command center. The request is routed to the first available mobility manager – a person who uses vehicle images and sensor data (streamed over the wireless network) to assess the situation, decide on the correct action, and create a safe path around the obstruction. Once clear of the area, the vehicle resumes fully autonomous operations, and the mobility manager is free to assist other vehicles calling for assistance.
As this is all happening, other autonomous vehicles in the area are also communicating with SAM. The system learns and shares the new information created by the Mobility Manager. Once the solution is found, it’s sent to the other vehicles.
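The SAM flow described above amounts to a small state machine: drive, stop safely, ask a human, follow the approved path, resume and share. The sketch below illustrates that loop; the class and method names are assumptions for the example, not Nissan’s software.

```python
# Hypothetical sketch of the SAM request-assist-resume loop described above.
# All classes and interfaces are illustrative stand-ins.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    SAFE_STOP = auto()
    WAITING_FOR_MANAGER = auto()

class Vehicle:
    def __init__(self):
        self.mode = Mode.AUTONOMOUS
    def camera_snapshot(self): return "<camera images>"
    def sensor_snapshot(self): return "<lidar/radar frames>"
    def follow(self, path): print(f"Driving manager-approved path: {path}")

class CommandCenter:
    def request_guidance(self, request):
        # The first available mobility manager reviews the streamed images and
        # sensor data, then draws a safe path around the obstruction.
        return ["shift left 1 m", "pass stopped truck", "rejoin lane"]

class Fleet:
    def broadcast_solution(self, path):
        print(f"Sharing learned path with nearby vehicles: {path}")

def handle_obstruction(vehicle, command_center, fleet):
    vehicle.mode = Mode.SAFE_STOP             # bring the car to a safe stop
    vehicle.mode = Mode.WAITING_FOR_MANAGER   # request help from the command center
    request = {"images": vehicle.camera_snapshot(), "sensors": vehicle.sensor_snapshot()}
    path = command_center.request_guidance(request)
    vehicle.follow(path)                      # drive the approved path
    vehicle.mode = Mode.AUTONOMOUS            # resume fully autonomous operation
    fleet.broadcast_solution(path)            # other vehicles learn the solution

handle_obstruction(Vehicle(), CommandCenter(), Fleet())
```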
Nissan is working with Microsoft to use Microsoft’s personal assistant technology, Cortana, to make driving more efficient and enjoyable. Cortana will allow the vehicle to adapt to personalized driver settings, even understanding different driver preferences in a shared vehicle, almost making it feel like your own. The companies will create next-generation connected and mobility services for cars using the Microsoft Connected Vehicle Platform, which is built on Azure, Office 365, Cortana and other intelligent cloud services provided by Microsoft.
Nissan is also working to use EVs as energy sources. Integration of EVs into society will help energy distribution across the grid through vehicle-to-home (V2H), vehicle-to-building (V2B) and vehicle-to-grid (V2G) solutions.
Bose’s new sound management technology is intended to help drivers better process and react to the increasing amount of audible information produced by today’s and next-generation cars, such as safety prompts, navigation signals, vehicle system alerts, Bluetooth phone calls and text-to-speech messages.
Bose’s new technology utilizes UltraNearfield headrest speakers and Bose proprietary algorithms to place non-entertainment signals in virtual spaces around the driver, where they intuitively make the most sense.
If you are new to AUTO Connected Cars News and like what you are reading, you are welcome to subscribe to our newsletters.
Honda
Honda showcased projects with startup companies LEIA and VocalZoom at CES 2017. In collaboration with the Honda Xcelerator, these two innovators’ technologies aim to revolutionize the in-cabin experience.
Together, Honda Xcelerator and LEIA have developed a new 3D driver display demonstration that leverages nanotechnology to provide seamless transitions between different viewing angles for warnings and driver-assistive systems. VocalZoom has been working with Honda Xcelerator to apply its optical microphone technology to improve voice interaction inside the vehicle.
Although 3D can be distracting if it isn’t designed correctly, LEIA’s nanotech approach presents depth in a way that feels natural. As a result, when the driver moves his or her head while looking at the screen, the content changes aspect continuously and allows for multi-view 3D imagery.
VocalZoom’s optical sensor “reads” facial skin vibrations during speech, enabling it to isolate a driver’s voice from all of the other background sounds in the car. The result is clean, isolated driver commands that are significantly easier for automotive voice-recognition systems to understand and obey than was previously possible with traditional voice-control solutions. In a proof-of-concept demo at Honda’s CES display, attendees can experience the VocalZoom technology.
Honda offered the first proof-of-concept demonstration of in-vehicle payments with parking and fueling infrastructure partners at CES 2017 in Las Vegas, as part of its ongoing partnership with Visa Inc.
Drivers are notified that they can pay for fuel or parking when they are near a smart parking meter or fuel pump. Depending on the service, the purchase amount is displayed in the dashboard and drivers confirm payment with the touch of a button. Honda is currently in discussions with a number of other companies to help ease payment for other car-based transactions.
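The payment flow described above reduces to a notify-confirm-charge loop. The sketch below illustrates it under assumed names; it is not Honda’s or Visa’s actual API.

```python
# Hypothetical sketch of the in-vehicle payment flow: proximity notification,
# amount shown on the dashboard, one-touch confirmation. Names and fields are
# illustrative assumptions only.
class Dashboard:
    def show(self, message): print(f"[dash] {message}")

def offer_payment(nearby_service, dashboard, driver_confirms) -> bool:
    """Notify the driver near a smart meter or fuel pump and confirm payment."""
    dashboard.show(f"{nearby_service['type']}: pay ${nearby_service['amount']:.2f}?")
    if driver_confirms():  # a single button press in the real demo
        # Tokenized card details would be sent to the payment network here.
        dashboard.show("Payment approved")
        return True
    dashboard.show("Payment cancelled")
    return False

offer_payment({"type": "Smart parking meter", "amount": 4.50},
              Dashboard(),
              driver_confirms=lambda: True)
```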
Honda unveiled its Cooperative Mobility Ecosystem concept at CES 2017, connecting the power of artificial intelligence, robotics and big data to transform the mobility experience of the future and improve customers’ quality of life. Featuring a number of prototype and concept technology demonstrations at CES, the Honda concept envisions a future where vehicles will communicate with each other and infrastructure to mitigate traffic congestion and eliminate traffic fatalities, while increasing the productivity of road users and delivering new types of in-vehicle entertainment experiences. Vehicles will create new value by autonomously providing services when not in use by their owners.
Honda introduced the Honda NeuV, an electric automated mini-vehicle concept equipped with an artificial intelligence (AI) “emotion engine” and automated personal assistant, creating new possibilities for human interaction and new value for customers.
Designed to create new possibilities for customers, the NeuV (pronounced “new-v”), which stands for New Electric Urban Vehicle, is a concept vehicle whose genesis is based on the fact that privately-owned vehicles sit idle 96 percent of the time. The NeuV explores the idea of how to create new value for its owner by functioning as an automated ride sharing vehicle, picking up and dropping off customers at local destinations when the owner is not using the car. The NeuV also can sell energy back to the electric grid during times of high demand when it’s not in use. These activities have the potential to create a new business model for enterprising customers.
The NeuV features a full touch panel interface enabling both the driver and passenger to access a simple and convenient user experience. The vehicle has two seats, a storage area in back, and an electric skateboard for “last mile” transit. The NeuV also features outstanding outward visibility via a headerless windshield and a dramatically sloping belt line that make maneuvering easy.
At CES, Honda launched its “Safe Swarm” concept, which utilizes bio-mimicry – replicating the behavior of a school of fish – to create a safer, more efficient and enjoyable driving experience. The Honda Safe Swarm demonstration immerses visitors in a world where vehicles sharing the road communicate with one another using dedicated short range communication (DSRC) to support the driver in negotiating complex driving situations. The Safe Swarm concept enables vehicles to operate cooperatively, enabling more efficient, low-stress and, ultimately, collision-free mobility.
Audi & NVIDIA
The most popular autonomous ride at CES was at the Gold Lot from NVIDIA and Audi. It was sold out, and many celebrities from all over the world, including AUTO Connected Car News, couldn’t get a ride.
Audi of America President Scott Keogh and NVIDIA founder and CEO Jen-Hsun Huang announced an expansion of their work on artificial intelligence. The Audi Q7 piloted driving concept vehicle uses the NVIDIA DRIVE PX 2 platform to navigate real-road complexities.
The vehicle relies on its trained AI neural networks to recognize and understand its environment, then drive safely around the track without any computer programming. With no driver behind the wheel, it performed several laps on a closed course, where the configuration of the track was modified in the middle of the demonstration. The course features a variety of road surfaces including areas with and without lane markings, dirt and grass, as well as a simulated construction zone with cones and dynamic detour indicators.
Audi will expand testing of highly automated, artificial intelligence-equipped vehicles on public roads in California and select states next year.
In parallel to delivering artificial intelligence solutions for complex urban driving, Audi brings to market this year the world’s first vehicle to meet the standards of Level 3 automation as defined by SAE International. The next generation Audi A8 will feature Traffic Jam Pilot, which uses a central driver assistance controller, or zFAS, with NVIDIA hardware and software. This system will give drivers the option to turn over steering, throttle, and braking functionality to the vehicle at speeds of up to 35 mph when certain conditions are met, aiding Audi drivers during their often stressful freeway commutes.
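Conceptually, Traffic Jam Pilot’s handover is gated on speed and driving conditions. The sketch below shows that kind of check with assumed condition names and the 35 mph limit from the announcement; it is not Audi’s zFAS logic.

```python
# Hypothetical sketch of a Level 3 handover gate: the driver may turn over
# steering, throttle and braking only below 35 mph and only when the other
# conditions are met. Condition names are illustrative assumptions.
MAX_HANDOVER_SPEED_MPH = 35

def traffic_jam_pilot_available(speed_mph: float,
                                on_divided_highway: bool,
                                lane_markings_detected: bool,
                                traffic_dense: bool) -> bool:
    return (speed_mph <= MAX_HANDOVER_SPEED_MPH
            and on_divided_highway
            and lane_markings_detected
            and traffic_dense)

print(traffic_jam_pilot_available(28, True, True, True))   # True: pilot may take over
print(traffic_jam_pilot_available(55, True, True, False))  # False: too fast, traffic flowing
```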
NVIDIA & Mercedes-Benz
Mercedes-Benz and NVIDIA announced a partnership to bring an NVIDIA AI-powered car to market. The work is part of an ongoing collaboration focused on deep learning and artificial intelligence. Guy Kawasaki welcomed Sajjad Khan and NVIDIA co-founder Jen-Hsun Huang to speak at CES.