Drivers no longer interact with the vehicle simply by pushing buttons and pulling levers. Human-machine interface (HMI) systems are even evolving to the point where drivers can have a conversation with the vehicle. Here, Fatima Vital, Director of Marketing, Automotive & Consumer Electronics at Nuance Communications (Nuance), tells Automotive World about the changing face of connectivity and its influence on the next stage of man-machine interaction, aptly named HMI 2.0.
“A wide variety of factors influence HMI,” says Vital, highlighting three key megatrends that will drastically change the way consumers interact with their cars: increasing connectivity, growing levels of vehicle automation and new mobility models. “Very often people refer to connectivity as connectivity to infotainment, but what we see is beyond infotainment – connectivity to the Cloud and providing access to data at any time, to other devices, cars and smart cities,” she says. “It will also involve connectivity to the driver itself, including things like health monitoring and wellness.”
Of chief interest to Nuance, however, is sensor connectivity, an important aspect of the company’s Dragon Drive technology. This Cloud-based system is embedded within the vehicle’s infotainment system and combines natural language understanding (NLU) and text-to-speech functionality. Dragon Drive currently supports more than 29 languages, and enables access to apps and services such as music, weather reports and social media simply by talking to the car. What’s more, all functions are available from the main menu, which means the user can simply ask, ‘Where is the closest gas station?’ or say, ‘Call my wife’, without having to go through a step-by-step dialogue. The system is referred to as an ‘automotive assistant’.
“One important aspect we are working on is leveraging information that comes from the car – things like crash avoidance sensors or fuel sensors – and putting that information into the conversation with the driver,” explains Vital. “For instance, if the car detects a situation that is potentially stressful for the driver, or if there is a potential crash risk, it is important that the dialogue adjusts and puts communications on hold.” Nuance has been carrying out studies into this topic, looking into both the success rate of certain dialogues and driver satisfaction as a result of the ‘conversation’.
As Vital affirms, “Connectivity is certainly impacting the vehicle HMI.”
New mobility needs a seamless experience
Upcoming trends in ‘new mobility’ are also expected to change the way vehicle HMI is designed. Rapidly increasing levels of connectivity in vehicles mean the automotive industry is no longer simply about driving a car, and ownership is no longer the only way of guaranteeing regular access to a car.
In densely populated urban areas and, in future, megacities, young people will increasingly leverage car-share services as opposed to vehicle ownership. This has an important impact on HMI development, says Vital. “Drivers will be using different cars all the time, and the same car will be used by different drivers all the time. In an ideal world, the driver will be able to bring in his or her HMI profile so there is a consistent user experience across these cars,” she explains. “That is also of course linked to connectivity, because this is something that could be stored in the Cloud.”
The vehicle’s HMI system should also be able to learn from past experiences – for instance if the driver has particular navigation or point of interest (POI) preferences. “If I am looking for a car park, it means something different to someone else – I might not be willing to walk as far, whereas someone else might be willing to walk if it is less expensive,” suggests Vital. “I might need a car park with certain requirements like easy access for children and so on. This is about personalisation, and that is a key aspect of HMI 2.0.”
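Vital's car-park example can be sketched in code: the same query ranks results differently depending on a driver profile. The profile fields, weights and example data below are illustrative assumptions, not Nuance's actual implementation.

```python
# Hypothetical sketch of personalised POI search: the same "find a car park"
# query yields different top results for different driver profiles.
from dataclasses import dataclass

@dataclass
class CarPark:
    name: str
    walk_minutes: float      # walking time from car park to destination
    price_per_hour: float

@dataclass
class DriverProfile:
    max_walk_minutes: float  # how far this driver is willing to walk
    price_weight: float      # how much price matters relative to distance

def rank_car_parks(parks, profile):
    # Filter out anything beyond this driver's walking tolerance, then
    # rank by a weighted trade-off of walking time and price (lower is better).
    candidates = [p for p in parks if p.walk_minutes <= profile.max_walk_minutes]
    return sorted(
        candidates,
        key=lambda p: p.walk_minutes + profile.price_weight * p.price_per_hour,
    )

parks = [CarPark("Central", 2, 6.0), CarPark("Edge", 12, 2.0)]
hurried = DriverProfile(max_walk_minutes=5, price_weight=0.1)
thrifty = DriverProfile(max_walk_minutes=15, price_weight=3.0)
print(rank_car_parks(parks, hurried)[0].name)  # Central
print(rank_car_parks(parks, thrifty)[0].name)  # Edge
```

The point of the sketch is that personalisation lives in the profile, not the query: the hurried driver never even sees the cheaper but distant option.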
New mobility is not simply about driving a car, but more about the broader topic of ‘door-to-door transportation’, says Vital. In future, she expects new mobility to encompass various transportation means. “It is transportation beyond the car,” she explains. “The transportation experience starts at home when I am planning the trip, which could consist of walking, public transportation and car sharing, so there has to be a seamless integrated experience for me as a user, and very personalised to my personal needs.”
The impact of autonomy
Once merely a theoretical concept on the periphery, autonomous driving is now firmly in the sights of many players – Nuance included. How drivers will interact with a vehicle that can drive itself remains an open question, but Nuance is keen to highlight the potential benefits its Dragon Drive automotive assistant could offer in future.
“Changes expected through autonomous driving will have an impact on our lives and society in general, but looking at the HMI in particular, there will be an increasing shift from driving tasks to so-called secondary tasks. So, instead of having unshared attention to the driving situation, we see infotainment- and productivity-focussed tasks playing a greater role,” she said. “We see these services shifting with an increasing shift to infotainment for movies and other tasks that are more visually related than they are today.”
With innovative technology being pumped into new cars to facilitate a highly connected and assisted driving experience, is it about simplifying something that has become increasingly complex? Vital thinks so: “Assistants in the car will play an increasing role. We have been looking at how people interact with the cars, and believe that our Dragon Drive automotive assistant can – throughout the journey – perfectly meet personal driving-related needs, and can do some of the thinking for the driver while he or she stays focussed on a safe drive.”
These automotive assistants have to be of automotive-grade quality – that is, more intelligent than ever. Nuance is investigating artificial intelligence (AI) and NLU technology which enables these systems to learn about the driver. Vital expects that the driver will be able to ask for things in generic terms, such as: ‘Find me an expensive restaurant.’
“Although it sounds very simple, if I ask to find a restaurant, the restaurant has to be open and has to accept reservations,” points out Vital. “If I say ‘tonight’ in the context of a restaurant, that means probably between six and ten o’clock. If I say ‘tonight’ in the context of a movie, it’s probably between eight and 11pm. Things have different meanings in different contexts, and this has to be represented by a very intelligent system,” she affirms.
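The context-dependence Vital describes can be illustrated with a small sketch: the same word, ‘tonight’, resolves to a different concrete time window depending on the conversational domain. The domains and hour ranges below are assumptions taken from her example, not Nuance's actual Dragon Drive logic.

```python
# Illustrative sketch: resolving the vague expression "tonight" to a
# concrete time window, depending on what the driver is talking about.
from datetime import date, datetime, time

# Assumed domain-specific interpretations of "tonight" (24-hour clock),
# following Vital's restaurant vs. movie example.
TONIGHT_WINDOWS = {
    "restaurant": (time(18, 0), time(22, 0)),  # roughly 6pm to 10pm
    "movie": (time(20, 0), time(23, 0)),       # roughly 8pm to 11pm
}

def resolve_tonight(domain: str, on_date: date) -> tuple:
    """Map 'tonight' to a (start, end) datetime pair for the given domain."""
    start, end = TONIGHT_WINDOWS[domain]
    return (datetime.combine(on_date, start), datetime.combine(on_date, end))

start, end = resolve_tonight("restaurant", date(2024, 5, 17))
print(start.hour, end.hour)  # 18 22
```

A production NLU system would of course learn these windows rather than hard-code them, and would layer on further context such as the weather example Vital gives below.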
Nuance already has NLU systems in use in the current BMW 7 Series. However, “This will have to evolve much further to understand these kinds of things,” Vital admitted. “If it is raining, then the term ‘near’ probably has a different meaning for me than if it is sunny. All of this has to be understood by the automotive assistant, and is a key research area for Nuance.” She advised that this is currently being built into the Dragon Drive system for future deployment.
HMI 2.0 – a fair concept?
Will vehicle HMI systems change to such a degree that HMI 2.0 is a fair description? And will there be a significant step change in years to come? “It does make sense,” says Vital. “We see more and more functionality coming into the system. However, what we expect – and what car makers have requested – is to make it more integrated. If there is an email saying, ‘Should we meet tonight at 7pm at a specific restaurant?’, we want to extract that information and make it available by directing it to the user’s calendar and navigation system.”
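The email-to-calendar hand-off Vital describes can be sketched as a small extraction step that routes each recovered field to the right in-car service. A real system would use NLU rather than the simple pattern match below; the message format and field names are illustrative assumptions.

```python
# Hypothetical sketch: pull a day, time and venue out of a meeting proposal
# and route the pieces to the calendar and navigation system respectively.
import re

def extract_meeting(email_text: str):
    """Return calendar and navigation payloads, or None if nothing matches."""
    match = re.search(
        r"meet (tonight|tomorrow) at (\d{1,2}(?::\d{2})?\s?[ap]m) at (.+?)[.?!]?$",
        email_text,
        re.IGNORECASE,
    )
    if not match:
        return None
    day, clock, venue = match.groups()
    return {
        "calendar_entry": {"day": day.lower(), "time": clock},
        "navigation_destination": venue,
    }

info = extract_meeting("Should we meet tonight at 7pm at Luigi's Trattoria?")
print(info["navigation_destination"])  # Luigi's Trattoria
print(info["calendar_entry"]["time"])  # 7pm
```

The design point is the integration Vital calls for: one utterance or message feeds several previously separate subsystems at once.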
In the coming two to three years, Vital expects the automotive assistant to become increasingly intelligent, and within five years it will be able to anticipate driving-related needs based on various sources of information. “It could be information from me personally – my emails, my address, my calendars, and anticipating where I will have to go,” she says. “This would help to anticipate what I am likely to do and actively propose a schedule, and how I should manage these multiple tasks. We see the automotive assistant playing much more of a ‘real’ role, not just helping me with the tasks I tell it to.”
In essence, automotive assistants today are designed to understand, but in future, will be able to anticipate requirements and make active recommendations. In order to get to this point, however, steps need to be taken to converge the various modes of HMI. Voice biometrics, for example, is expected to become a common feature, and will need to be integrated with touch, haptic and predictive interaction through the automotive assistant. “The integration of different modalities playing a role in the HMI needs to be considered. Today, voice and haptic modalities are still fragmented or separated, and a tight integration of those modalities will be a key characteristic of HMI 2.0,” Vital concludes.
This article is part of an exclusive Automotive World report on connected cars. Follow this link to download a copy of ‘Special report: Connected cars’.