Arnaud Lagandré (left), Ulrich Lueders
Maximising the time we spend in a car has become a major focus for manufacturers as the auto industry prepares for automated driving. Tomorrow’s cockpit will be digital and immersive, creating a more personalised, versatile and connected user experience (UX) that allows all occupants to make the most of their time onboard. To learn more about smart cockpits and how the marketplace for LiDAR is evolving, we caught up with Ulrich Lueders, Director Strategy & Portfolio, Business Unit Human Machine Interface at Continental, and Arnaud Lagandré, Head of Business Unit Advanced Driver Assistance Systems North America at Continental.
How do you see electric vehicles influencing the design of the cockpit area?
Ulrich Lueders: Along with the electrification of mobility, we see a strong trend towards digitalisation of the vehicle interior. Many electric vehicles come with large displays instead of analogue elements. For vehicle manufacturers, electric vehicles are not just cars with a different propulsion technology but rather a clean slate for new design approaches and a clear focus on user experience (UX). We recognised this trend early on and have started working on holistic UX concepts together with vehicle manufacturers rather than only offering components for a modern and innovative interior.
Consumers’ car-buying decisions, by the way, are increasingly driven by the user experience, not least because former key differentiators such as horsepower become less relevant with electric vehicles. You could say that UX is the new horsepower.
Presumably using eyes, voice and hand gestures, it is possible to eliminate buttons from an infotainment system. What is your vision of this touch-free user experience?
Ulrich Lueders: You are right – eye gaze tracking, voice interaction and gestures can greatly reduce the amount of touch interaction needed to operate the infotainment system. However, we do not believe in a completely touch-free user experience in the near future. Consider that the tactile sense makes up a significant portion of an immersive, holistic user experience. Our vision of the future cockpit is digital, immersive and “shy”.
Digital: There will be many and/or large displays in the car, seamlessly orchestrated by a central computing unit such as our Cockpit High Performance Computer (Cockpit HPC). This enables a consistent visual and interaction design across all displays and other devices in the vehicle – be it touch, gesture, speech or any other kind of interaction.
Immersive: Creating a user experience with more depth, we have recently presented a 3D display technology that works without glasses or eye-tracking and is able to provide a natural 3D visualisation to all vehicle occupants at the same time. Interaction with the holographic 3D elements popping out of the screen is possible via gestures or touch with haptic feedback. In combination with 3D audio systems, this will make the cockpit experience even more immersive.
Shy: Beyond traditional displays, there will be hidden (or shy) displays behind a translucent material with the natural look and texture of, for example, wood or leather. The display only becomes visible when there is relevant information to present. Touching such a shy display really adds to the tactile experience in the car. When the display is inactive, you will just see the natural surface. This is not only a breakthrough technology for interior designers, but also a great thing for users, considering how overwhelming information overload can be, especially while driving a car.
Hence, touch will not be eliminated any time soon. The interaction of the future is going to be multi-modal in a way that functions can be approached in multiple ways. It will become more common to interact via touch, voice, haptics and gestures.
What are your thoughts on gesture controls? Do you think we could see more of them in mass-produced cars or limited to high-end, niche applications?
Basic gesture control will not necessarily be an exclusive feature for high-end cars any longer.
Ulrich Lueders: Gesture control can be a supportive means of interaction in the car, if we are talking about simple and intuitive gestures. We see a certain potential for this functionality to expand into more cars in the future. A decisive factor is the cost of the interior sensing technology that is needed to realise gesture recognition. Looking at the upcoming legislation for driver monitoring systems, interior cameras will also move into lower-segment vehicles. With this, basic gesture control will not necessarily be an exclusive feature for high-end cars any longer.
The thought of bringing more data into cars raises the question of driver distraction. How is Continental addressing this area?
Ulrich Lueders: We not only have engineers developing innovative products, but also psychologists, ergonomists and UX designers constantly working on HMI (human-machine interface) solutions that are not only joyful to use but also easy to use. Safety and the reduction of driver distraction are the focus of their work. In addition, technology itself can help to minimise driver distraction. To give a few examples:
Cabin sensing: Understanding the driver’s condition and state of mind massively helps to determine what type and what amount of information they are able to handle in a certain situation. The management of information will be linked with this knowledge.
Digital companion: Proactive and situation-aware virtual assistants enable low-distraction interaction between driver and car. Based on the cabin sensing data, a digital companion can provide information via audio when the driver’s eyes need to stay focused on the road.
Privacy technologies: While we see growing potential for entertainment features in cars, it is essential to ensure that these additional features do not distract the driver as long as the car is not driving fully automated. The first cars with dedicated passenger displays are already on the market. Continental is developing solutions that help entertain the passenger without distracting the driver. For example, we are developing a special backlight technology for passenger displays that provides privacy and minimises driver distraction when the passenger is watching a movie. Another example is our new feature for the Ac2ated Sound system that enables personal sound zones.
What opportunities does a speakerless 3D immersive audio system open up for Continental?
Ulrich Lueders: With our speakerless Ac2ated Sound system, we have revolutionised the field of in-car audio, which has basically stayed the same for decades. The many requests we keep receiving from vehicle manufacturers show us that we are really striking a chord with this technology.
With the significantly lower weight and installation volume of the actuators, we can contribute to lower CO2 emissions worldwide. By comparison, traditional high-end audio systems in vehicles often require between 10 and 20 or even more speakers to achieve 3D sound. These systems usually weigh up to 40 kilograms and have a total installation volume of 10 to 30 litres. Modern vehicle architectures and interior concepts, however, require increasingly efficient, lighter and space-saving solutions to sustainably improve the user experience of vehicle occupants. By removing conventional loudspeakers from the vehicle and replacing them with small, light actuators on existing car surfaces, we save up to 90% in weight and installation space and offer all car occupants significantly more freedom of design and more space. This reduction in weight and installation space makes the technology particularly attractive for electric and hybrid vehicles.
Overall, however, besides these technical benefits, Ac2ated Sound enables a more seamless interior design and immersive user experience. Leveraging the patented AMBEO 3D audio technology from our partner and audio specialist Sennheiser, we are creating an invisible, immersive, real-life experience for all listeners in the vehicle. Addressing the audible portion of the user experience, Ac2ated Sound is a valuable lever for us when it comes to creating holistic UX concepts.
What opportunities do virtual personal assistants open up for Continental?
Ulrich Lueders: The HMI in future vehicles will become more flexible in order to be personalised and adapted to the user’s needs. One key enabler for this is the Digital Companion system, which provides room for various new user experiences. It will be a door opener for a variety of services, especially those that would be difficult to operate blindly. For example, selecting a restaurant in an area you haven’t been to before would be much easier if you could just talk to your car and it made suggestions based on your personal preferences. Virtual assistants will also play a major role in building the driver’s trust in increasing automation by providing relevant ad hoc information about driving manoeuvres or potentially critical situations that require handing control back to the human driver. The overall effectiveness of a virtual assistant will increase tremendously with the amount of knowledge it has. Therefore, we are also working on cabin sensing technologies that help our Digital Companion better understand the mood of driver and passengers. This will be essential to enable a personal and situation-aware virtual assistant.
How important is the aesthetic design of the audio system for today’s consumer?
Ulrich Lueders: Carrying the entire world of music around in their pockets, many consumers today attach great value to a premium-quality in-car audio system. The willingness to pay extra for an audio system also depends mainly on premium audio quality. We also see that the production volume of branded vehicle audio systems has been increasing strongly in recent years, and it will continue to increase. However, audio quality is not the only decisive factor. About 50% of people intending to buy a new car also consider the design of the audio system very important. We are convinced that we are perfectly serving these trends with our speakerless Ac2ated Sound solution in partnership with premium audio brand Sennheiser.
LiDAR is said to be the most important sensor in the suite that enables the different levels of driver autonomy. Yet one of the challenges for manufacturers with this type of sensor in particular is to achieve reliability and robustness along with economic viability. How does your fifth-generation radar solution address this?
Arnaud Lagandré: The key word on the way towards higher automation levels is redundancy. None of the available sensor technologies (camera, LiDAR or radar) will alone be able to realise automated driving functionalities. Therefore, it is important to have a comprehensive sensor portfolio, like ours, to be able to take the next steps towards autonomous mobility. By using different physical principles, you ensure the highest level of reliability in comprehending your environment.
At this stage, high-resolution LiDAR is more expensive than the other two, more mature technologies; it is just starting to enter the automotive market, but it will undoubtedly reach the maturity and cost competitiveness required to play a crucial role in automated driving systems.
Our ARS 540 long-range radar, which was just honoured with a CES 2021 Innovation Award, showcases how powerful sensor technology can be. As the first production-ready 4D imaging radar, its capability to measure elevation in addition to speed, range and azimuth angle allows more accurate detection of the most varied objects, even relatively small ones. This is crucial for managing complex driving scenarios. In combination with our Smart and Satellite cameras, our LiDARs, as well as our HPC for the needed computing power, our sensor and fusion portfolio offers everything needed for autonomous mobility.
Achieving the performance standards necessary for SAE Level 3–5 driver autonomy at lower cost requires a fresh approach. Can you explain a little about your partnership with AEye and what you are trying to achieve?
Arnaud Lagandré: With the partnership with AEye, Continental expands its LiDAR technology portfolio, further strengthening its machine vision and environmental sensor portfolio for automated driving: camera, radar and LiDAR. The aim is to jointly develop a high-performance, long-range LiDAR sensor based on AEye’s patented agile architecture, which utilises a novel advanced micro-MEMS technology operating at a 1550 nm wavelength. With this addition, Continental now offers two unique LiDAR technologies in its portfolio: the 3D Flash LiDAR technology for short-range applications and AEye’s advanced micro-MEMS technology for long-range applications.
How do you see the marketplace for LiDAR evolving?
The LiDAR market remains very dynamic where the requirements for components change rapidly.
Arnaud Lagandré: For Continental, it is clear that different LiDAR technologies are necessary depending on the use case and application requirements, because each technology has its own inherent strengths. Therefore, we see the need for the two complementary LiDAR technologies – short-range and long-range – to be able to react to different market needs. The LiDAR market remains very dynamic, and the requirements for components change rapidly. With an extended LiDAR technology portfolio, we are well positioned for future business and are an optimal partner for our (potential) customers. The investment in AEye is an investment in our future LiDAR technology portfolio.
And finally, Continental’s transparent trailer technology received a CES Innovation Award. What are its benefits?
Arnaud Lagandré: The Transparent Trailer technology extends Continental’s trailering portfolio, allowing drivers to “see through” a trailer in tow and check the area behind and beside it. Based on Continental’s Surround View system, the technology enables safer driving while towing. Two cameras and a control unit work together to provide a panoramic view that renders the trailer virtually invisible. The result is a seamless live feed that lets drivers see the road and any obstacles behind the trailer.