Lynn Walford September 25, 2018

Nuance Dragon Drive Demonstration. Credit: Nuance Automotive

Wouldn’t it be nice if, instead of fumbling through screen menus or hitting buttons or knobs, you could just tell your car what to do and it would do it? Cars are getting more conversational and, in the future, they may even start anticipating your needs and desires.

Typical functions that can be initiated by voice command include music apps, navigation, telephone calls, trip information and climate control, says Dan Cauchy, Executive Director of Automotive Grade Linux (AGL), an open source project that brings together automakers, suppliers and technology companies.

AGL has a working group creating a software API (Application Programming Interface) that can work with either the Alexa or the Nuance voice engine and can be easily ported to different cars without having to be hand coded. AGL members include Toyota, Mazda, Mercedes-Benz, Nissan, Subaru and Suzuki. Cauchy says the API could enable thousands of apps in vehicles.
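
To illustrate the idea, here is a minimal sketch in Python of what such a voice-engine abstraction layer could look like. The names (VoiceEngine, recognize, interpret, NuanceEngine) are hypothetical placeholders, not AGL's actual API; the point is simply that apps target one interface while the engine behind it can be swapped without hand coding each integration.

```python
from abc import ABC, abstractmethod


class VoiceEngine(ABC):
    """Common interface a vendor engine (Alexa, Nuance, ...) would implement."""

    @abstractmethod
    def recognize(self, audio: bytes) -> str:
        """Convert captured speech to text."""

    @abstractmethod
    def interpret(self, text: str) -> dict:
        """Map text to an intent, e.g. {'intent': 'navigate', 'query': '...'}."""


class NuanceEngine(VoiceEngine):
    """Placeholder vendor implementation with stubbed results."""

    def recognize(self, audio: bytes) -> str:
        return "navigate to the nearest charging station"  # stubbed recognition result

    def interpret(self, text: str) -> dict:
        return {"intent": "navigate", "query": "nearest charging station"}


def handle_utterance(engine: VoiceEngine, audio: bytes) -> dict:
    """An app written against VoiceEngine runs unchanged whichever vendor is plugged in."""
    return engine.interpret(engine.recognize(audio))


print(handle_utterance(NuanceEngine(), b"\x00"))
```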

Although automakers don’t usually reveal exactly what technology is used in their vehicles until they are released, Volvo announced that it is working with Google to integrate Google Assistant in future models. When Amazon announced its Alexa Auto Software Development Kit (SDK), it reported that major automakers like BMW, Ford, SEAT and Toyota are already working to integrate Alexa directly into their vehicles. Alexa’s automotive functions include calling, navigation, audio streaming, setting destinations and searching for points of interest.

A conversation with your car can be initiated with the voice button on the steering wheel or with a key phrase such as ‘Alexa’, ‘Hey Siri’ or ‘Mercedes’, or whatever the automaker determines, says Robert Policano, Solutions Manager for Nuance Automotive.

Nuance provided the skill set and the voice engine for ‘Hey Mercedes’ (MBUX’s LINGUATRONIC), launched earlier this year in the Mercedes-Benz A-Class. Mercedes understands questions and replies. When the user asks ‘Should I take an umbrella today?’ the voice system checks the weather forecast and answers the driver accordingly. ‘Hey Mercedes’ even knows what to do if the user says, ‘Hey Mercedes, I’m cold.’ It adjusts the HVAC (heating, ventilation and air conditioning), says Policano.
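
A sketch of how a phrase like ‘I’m cold’ could end up adjusting the HVAC, assuming a simple intent-to-action table: the voice engine yields an intent, and the head unit maps it to a vehicle command. The intent names and Car methods here are illustrative, not Nuance’s implementation.

```python
# Illustrative intent-to-action mapping; not Nuance's actual implementation.
INTENT_ACTIONS = {
    "cabin_too_cold": lambda car: car.set_temperature(car.temperature + 2),
    "cabin_too_warm": lambda car: car.set_temperature(car.temperature - 2),
    "weather_query": lambda car: car.speak(car.fetch_forecast()),
}


class Car:
    def __init__(self):
        self.temperature = 20  # cabin setpoint in degrees Celsius

    def set_temperature(self, target: int) -> None:
        self.temperature = target
        print(f"HVAC set to {target} C")

    def speak(self, text: str) -> None:
        print(f"Assistant: {text}")

    def fetch_forecast(self) -> str:
        return "Rain expected this afternoon, take an umbrella."  # stubbed forecast


def on_intent(intent: str, car: Car) -> None:
    """Dispatch a recognized intent to the matching vehicle action, if any."""
    action = INTENT_ACTIONS.get(intent)
    if action:
        action(car)


on_intent("cabin_too_cold", Car())  # "I'm cold" -> HVAC set to 22 C
```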

Demonstration of “Hey Mercedes”. Credit: Nuance Automotive

Nuance is working on natural language for future automotive implementations and is demoing a system called ‘Just Talk’ that is always in listening mode and does not have to be keyword activated. It uses eye tracking so that when the driver looks at a building and asks ‘What is the user rating of that place?’ the car suggests the name of the place and, after confirmation, announces the user rating.

Ongoing conversations

Artificial intelligence companies, such as Passage AI, develop the conversations that work with the voice engines for automakers. Passage AI is currently working with five automakers on different types of use cases. The company programmed a spoken owners’ manual for one automaker and a more complex interaction for an unannounced 2019 model, says Ravi Raj, Passage AI’s CEO.

Raj says that voice can be used to open the hood, open the doors, open the boot or lock the doors; anything that the automaker assigns to an app can be turned into a voice function. Keyword activation is important for privacy. Voice commands could be triggered by mistake by a babbling baby or a passenger; however, functions such as opening the boot or hood are deactivated while the car is moving. Drivers are also given the choice to shut off keyword activation.
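
A minimal sketch of the kind of interlock Raj describes: a voice command only runs if the vehicle state allows it, so an accidental trigger cannot open the boot or hood while the car is moving. The command names and rules here are hypothetical, not any automaker’s actual logic.

```python
# Hypothetical safety interlock for voice commands; names and rules are illustrative.
DISABLED_WHILE_MOVING = {"open_boot", "open_hood"}


def execute_voice_command(command: str, speed_kmh: float, wake_word_enabled: bool) -> str:
    """Run a recognized command only when the vehicle state permits it."""
    if not wake_word_enabled:
        return "ignored: keyword activation switched off by the driver"
    if command in DISABLED_WHILE_MOVING and speed_kmh > 0:
        return f"blocked: '{command}' is disabled while the car is moving"
    return f"executing: {command}"


print(execute_voice_command("open_boot", speed_kmh=60.0, wake_word_enabled=True))
print(execute_voice_command("open_boot", speed_kmh=0.0, wake_word_enabled=True))
```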

Passage AI offers a use case that lets hungry drivers order food. Through voice interaction, the assistant finds the restaurant, takes the order and asks ‘Do you want fries with that?’ It places the order, alerts the user when the order is ready and confirms payment from the connected phone.

Raj sees a more popular use case scenario. “The killer application for voice is texting. Using voice makes sense. It’s easier and it’s less distracting than pushing buttons and reading things on a screen.”

Making complex vehicle functions easier

“Some car screen menus are seven pages deep to change one setting. With voice, it is much easier,” says Zachary Bolton, Head of Research and Development, Systems and Technology, Interior Division, Continental North America. Tiremaker Continental is developing advanced skill sets using Artificial Intelligence (AI) for digital assistants that handle vehicle-related interactions.

Bolton offers the following example. If the engine light is on, the assistant not only tells the driver that the light is on but uses vehicle data to inform the driver that it is an O2 sensor and that the car needs to be brought in for service within 200 miles. The assistant may also make an appointment with the local dealer or service operator. In the case of an electric vehicle, the digital assistant could offer a charging station and navigate to one with an available charging port. The digital assistant can be integrated with a calendar and, when it realizes that the driver is not going to make it to an appointment on time, offer to notify the appointment contact that the driver will be late. The digital assistant can also book parking near the destination, based on user preferences.
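
A rough sketch of how a diagnostic trouble code might be turned into the kind of spoken guidance Bolton describes: explain the fault, state the remaining safe driving range and offer to book service. The code, part mapping and mileage are examples for illustration, not Continental’s implementation.

```python
# Illustrative mapping from a diagnostic trouble code to a spoken message;
# the code, part and mileage are examples, not Continental's implementation.
FAULTS = {
    "P0130": {"part": "O2 sensor", "service_within_miles": 200},
}


def engine_light_message(dtc: str) -> str:
    """Build the assistant's spoken response for a check engine event."""
    fault = FAULTS.get(dtc)
    if fault is None:
        return "The check engine light is on. I recommend a diagnostic check."
    return (
        f"The check engine light is on. Vehicle data points to the {fault['part']}. "
        f"Please have it serviced within {fault['service_within_miles']} miles. "
        "Would you like me to book an appointment with your dealer?"
    )


print(engine_light_message("P0130"))
```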

However, Bolton warns that just because voice control is hands-free does not mean it cannot cause cognitive blindness. Some uses of voice, even talking intensely to a passenger, can take concentration away from driving. Voice functions will be truly useful when cars become fully autonomous. Continental is working on voice for when the vehicle becomes fully automated and the driver can take on deeper cognitive tasks, such as using a mobile office.

As the use of voice progresses with fully autonomous cars, car sharing and new mobility options, the computer in the car will not only converse with the rider but also be able to identify the rider. Nuance works with financial institutions that use voice as a password. Policano imagines a time when the rider says, ‘I’m Jane Doe and my voice is my password.’

He predicts that the car will then play the music you like, set the climate control and adjust the lighting. Just how you like it…
