Gesture Recognition, Proximity Sensors Drive Advances in Automotive Infotainment

At a time when social media interactivity is making its way into cars, the instrument control panel must evolve if drivers are to stay connected and perform tasks simultaneously and safely. Improved gesture recognition and speech recognition are two of the more prominent cockpit human-machine interface (HMI) technologies carmakers are developing to deliver safe, reliable interaction between driver and vehicle.

Gesture recognition: Hands in command

Gesture recognition technology is widely expected to be the next-generation in-car user interface. A gesture recognition system determines whether the driver has performed a recognizable hand or finger gesture within an allotted space, without contacting a touchscreen. For example, an approaching hand can activate the in-car infotainment system, or, in a more sophisticated system, the driver can touch the steering wheel and then tilt his or her head left or right to turn the stereo volume up or down. A camera placed in the steering wheel or on the dashboard is programmed to watch for certain gestures. When it sees them, it sends a signal to the processor that handles the connected infotainment hardware. The data is analyzed to determine what the driver is doing, ascertain which display controls the driver wants to adjust and then activate the appropriate features.
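The final stage of that chain can be as simple as a lookup from recognized gesture to infotainment command. The sketch below illustrates such a dispatch step in C; the gesture names and the infotainment_send() call are hypothetical placeholders for illustration, not any particular vendor’s API.

```c
/* Minimal sketch of the last stage of a gesture-recognition pipeline:
 * mapping an already-classified gesture to an infotainment action.
 * All names here (gesture_t, infotainment_send, ...) are hypothetical. */
#include <stdio.h>

typedef enum {
    GESTURE_NONE,
    GESTURE_HAND_APPROACH,   /* wake the infotainment display */
    GESTURE_SWIPE_LEFT,      /* previous radio station        */
    GESTURE_SWIPE_RIGHT,     /* next radio station            */
    GESTURE_DIAL_CLOCKWISE,  /* volume up                     */
    GESTURE_DIAL_COUNTER     /* volume down                   */
} gesture_t;

/* Placeholder for the call into the connected infotainment hardware. */
static void infotainment_send(const char *command)
{
    printf("infotainment command: %s\n", command);
}

static void dispatch_gesture(gesture_t g)
{
    switch (g) {
    case GESTURE_HAND_APPROACH:  infotainment_send("wake_display"); break;
    case GESTURE_SWIPE_LEFT:     infotainment_send("station_prev"); break;
    case GESTURE_SWIPE_RIGHT:    infotainment_send("station_next"); break;
    case GESTURE_DIAL_CLOCKWISE: infotainment_send("volume_up");    break;
    case GESTURE_DIAL_COUNTER:   infotainment_send("volume_down");  break;
    default:                     /* no recognizable gesture */      break;
    }
}

int main(void)
{
    dispatch_gesture(GESTURE_HAND_APPROACH);
    dispatch_gesture(GESTURE_DIAL_CLOCKWISE);
    return 0;
}
```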

This technology may well be familiar to gamers. That’s because it is not unlike Microsoft’s Kinect system for the Xbox game console, which detects motion from distances of up to approximately 10 feet. However, rather than tracking a user’s entire body motion as the Kinect does, automotive infotainment applications analyze only the user’s hand gestures.

A market study conducted in 2013 by IHS Automotive examined gesture-recognition technology and proximity sensing. According to IHS, the global market for automotive proximity and gesture recognition systems that allow motorists to control their infotainment systems with a simple wave of their hand will grow to more than 38 million units in 2023, up from about 700,000 in 2013. Automakers including Audi, BMW, Cadillac, Ford, GM, Hyundai, Kia, Lexus, Mercedes-Benz, Nissan, Toyota and Volkswagen are all in the process of implementing some form of gesture technology into their automobiles.

Hyundai’s HCD-14 is a luxury four-door concept sedan featuring gesture controls for audio, HVAC, navigation and smartphone connectivity functions. The driver selects a main function by gazing at a heads-up display (HUD), presses a thumb button on the steering wheel to confirm his/her selection and then performs gestures, which include moving a hand in or out from the dashboard to zoom in or out on the navigation system, a dialing motion to adjust volume and a side-to-side gesture to change radio stations.

Similarly, Visteon’s Horizon cockpit concept, which the company has been demonstrating to global vehicle manufacturers, uses 3-D gesture recognition to transform the way a driver controls features such as interior temperature, audio and navigation. In the Horizon cockpit concept, controls can be manipulated by moving the hand or just a finger. Radio volume, for example, can be adjusted by making a turning motion with one’s hand without making contact with the instrument cluster. The gesture recognition technology uses a camera system to map the user’s hand and replicates a virtual hand on the center stack display.

The industry answers with new sensors and tools

Semiconductor suppliers are developing the hardware needed to enable user command input with natural hand and finger movements. For example, Microchip’s MGC3130 is a three-dimensional gesture recognition and tracking controller chip based on the company’s patented GestIC technology, which uses an electric field (E-field) to provide gesture information as well as positional data of the human hand in real time. E-fields are generated by electrical charges and propagate three-dimensionally around the surface carrying the charge. Applying a direct (DC) voltage to an electrode results in a constant electric field. When a person’s hand or finger intrudes into that field, the field becomes distorted. GestIC technology uses a minimum of four receiver (Rx) electrodes to detect the E-field variations at different positions and, from the varying signals received, locate the origin of the distortion. That information is used to calculate the hand’s position, track its movements and classify movement patterns (gestures).
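As a rough illustration of how four receiver signals can be turned into a position estimate, the sketch below computes a weighted centroid of the deviations measured at electrodes assumed to sit at the four sides of a sensing frame. The electrode layout, scaling and centroid approach are illustrative assumptions; the actual signal processing inside the MGC3130 is Microchip’s own.

```c
/* Illustrative sketch only: a crude weighted-centroid estimate of hand
 * position from four receiver-electrode signal deviations. Electrode
 * placement and scaling are assumptions, not the GestIC algorithm. */
#include <stdio.h>

typedef struct { double x, y; } point_t;

/* Electrodes assumed at the midpoints of a unit square sensing frame. */
static const point_t rx_pos[4] = {
    {  0.0,  1.0 },   /* north */
    {  0.0, -1.0 },   /* south */
    {  1.0,  0.0 },   /* east  */
    { -1.0,  0.0 }    /* west  */
};

/* deviation[i] = |undistorted field - measured field| at electrode i.
 * A hand closer to an electrode produces a larger deviation, so the
 * deviations serve as weights in a centroid calculation. */
static point_t estimate_hand_position(const double deviation[4])
{
    double sum = 0.0;
    point_t p = { 0.0, 0.0 };
    for (int i = 0; i < 4; i++) {
        p.x += deviation[i] * rx_pos[i].x;
        p.y += deviation[i] * rx_pos[i].y;
        sum += deviation[i];
    }
    if (sum > 0.0) { p.x /= sum; p.y /= sum; }
    return p;
}

int main(void)
{
    /* Hand nearer the east electrode: larger deviation there. */
    double dev[4] = { 0.2, 0.2, 0.9, 0.1 };
    point_t p = estimate_hand_position(dev);
    printf("estimated position: (%.2f, %.2f)\n", p.x, p.y);
    return 0;
}
```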

A number of automotive touchscreens now employ proximity sensing, a technology that can enable touch-free interfaces in infotainment systems, keyless entry systems and lighting controls. The Cadillac User Experience was the first system to offer proximity sensing in a mass-market production vehicle: a pair of infrared sensors just below the screen detects when a user’s hand approaches and activates frequently used menus, such as lists of presets and navigation options.

A detector IC can be used together with up to three separate infrared LEDs to form a sensor that detects the 3-D movement of objects in front of the device. With this information, the content of a display can be controlled simply by waving a hand. Detection ranges of up to 20 cm are achievable, and the range can easily be extended by using high-power emitters. The SFH 7770 E6 from OSRAM Opto Semiconductors combines the functions of a digital ambient-light sensor with those of a digital proximity sensor: it measures ambient brightness, detects the presence of a nearby object and can tell whether that object is moving closer or farther away. The device can be used wherever short-range gesture recognition is needed.
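The host-side logic for this kind of short-range sensing can be quite small. The sketch below shows how an MCU might poll a digital proximity sensor, decide whether an object is approaching or retreating, and open a touch-free menu once a hand is close enough. The driver call, thresholds and sample values are assumptions for illustration and are not taken from the SFH 7770 E6 datasheet.

```c
/* Sketch of approach/retreat logic a host MCU might run on top of a
 * digital proximity sensor. Thresholds and counts are illustrative. */
#include <stdint.h>
#include <stdio.h>

#define APPROACH_DELTA   8   /* change per sample treated as real motion */
#define MENU_THRESHOLD 120   /* counts at which the touch-free menu opens */

/* In a real design this would be an I2C read of the sensor's proximity
 * output register; here it returns canned values so the sketch runs. */
static uint16_t read_proximity_counts(void)
{
    static const uint16_t samples[] = { 40, 60, 90, 130, 130, 90, 50, 30 };
    static unsigned i = 0;
    return samples[i++ % (sizeof samples / sizeof samples[0])];
}

static void proximity_poll(void)
{
    static uint16_t last = 0;
    uint16_t now = read_proximity_counts();

    if (now > last + APPROACH_DELTA)
        printf("object moving closer (%u counts)\n", (unsigned)now);
    else if (last > now + APPROACH_DELTA)
        printf("object moving away (%u counts)\n", (unsigned)now);

    if (now > MENU_THRESHOLD)
        printf("hand near screen: show frequently used menu\n");

    last = now;
}

int main(void)
{
    for (int n = 0; n < 8; n++)
        proximity_poll();
    return 0;
}
```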

Voice control: Tell me what you want

When traffic situations make driver interaction with infotainment apps potentially hazardous, voice control based on speech recognition can be a useful complement to conventional HMIs. Market research company IHS expects more than half of new automobiles in 2019 to integrate voice technologies such as voice recognition, text-to-speech and speech-to-text to enable drivers to control entertainment and navigation systems simply by using their voices.

To do so, speech interfaces in vehicles must overcome a low adoption rate among drivers, due primarily to perceived accuracy issues. Voice functions must actually work, providing true, reliable voice integration. And if they are to make the leap from purely task-oriented command and control to a more sophisticated, user-centric interface designed to negate distraction issues, voice systems must interact in a more conversational way instead of being restricted to a list of fixed, predefined menu phrases.

The good news is that speech recognition systems in the form of hands-free text-to-voice or voice-to-text operations are now delivering greater accuracy and employing more flexible grammar libraries. For example, Ford’s SYNC, based on the Microsoft Windows CE operating system, supports up to 10,000 voice commands with no training required for the system to recognize the commands.
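At its simplest, a fixed-grammar system boils down to matching a recognized transcript against a list of predefined phrases, as the sketch below illustrates. The phrase list and helper functions are hypothetical, and a production grammar such as SYNC’s holds thousands of entries rather than four.

```c
/* Sketch of fixed-grammar command matching, the "predefined menu phrase"
 * style of voice control described above. Phrases are hypothetical. */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static const char *commands[] = {
    "call home",
    "play artist",
    "tune to 101.1",
    "set temperature to 70 degrees",
};

/* Lowercase a transcript in place so matching is case-insensitive. */
static void normalize(char *s)
{
    for (; *s; s++) *s = (char)tolower((unsigned char)*s);
}

/* Return the index of the matching fixed phrase, or -1 if none. */
static int match_command(const char *transcript)
{
    char buf[128];
    strncpy(buf, transcript, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';
    normalize(buf);

    for (size_t i = 0; i < sizeof commands / sizeof commands[0]; i++)
        if (strcmp(buf, commands[i]) == 0)
            return (int)i;
    return -1;  /* out-of-grammar utterance: the weakness of fixed menus */
}

int main(void)
{
    printf("match index: %d\n", match_command("Call Home"));
    printf("match index: %d\n", match_command("navigate to the nearest coffee shop"));
    return 0;
}
```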

It will take a lot of computer muscle to power next-generation speech-based infotainment applications, enabling them to be context-aware and respond more quickly and with increased accuracy. Some speech functions will be performed on-board by the infotainment system and others, like dictating email or web content, may be performed off-board by servers in the Internet service provider’s data center.

Processors such as the Intel Atom E640 series have the computing headroom to perform critical noise- and echo-cancellation functions, eliminating the need for a separate digital signal processor (DSP) and providing speech applications and far-end listeners on phone calls with a cleaner input signal.
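Echo cancellation of this kind is typically an adaptive filter that estimates how much of the loudspeaker signal leaks back into the cabin microphone and subtracts that estimate. The sketch below shows a minimal normalized-LMS (NLMS) canceller of the sort a general-purpose core can run in software; the filter length, step size and toy test signal are illustrative assumptions rather than a tuned design.

```c
/* Minimal NLMS echo-cancellation sketch: the adaptive filter models the
 * loudspeaker-to-microphone echo path and subtracts the estimated echo.
 * TAPS, MU and the simulated signals are illustrative choices only. */
#include <stdio.h>

#define TAPS 64      /* adaptive filter length            */
#define MU   0.1     /* adaptation step size              */
#define EPS  1e-6    /* guards against division by zero   */

static double w[TAPS];     /* adaptive filter coefficients         */
static double xbuf[TAPS];  /* recent far-end (loudspeaker) samples */

/* x: far-end sample played to the cabin loudspeaker
 * d: microphone sample (near-end speech plus echo)
 * returns: echo-reduced sample passed on to the speech recognizer */
static double echo_cancel(double x, double d)
{
    /* shift the far-end history */
    for (int i = TAPS - 1; i > 0; i--) xbuf[i] = xbuf[i - 1];
    xbuf[0] = x;

    /* estimate the echo and subtract it */
    double y = 0.0, power = EPS;
    for (int i = 0; i < TAPS; i++) {
        y += w[i] * xbuf[i];
        power += xbuf[i] * xbuf[i];
    }
    double e = d - y;

    /* NLMS coefficient update */
    for (int i = 0; i < TAPS; i++)
        w[i] += (MU * e / power) * xbuf[i];

    return e;
}

int main(void)
{
    /* Toy example: the mic signal is a delayed, attenuated copy of the
     * far-end signal, so the residual should shrink as w[] adapts. */
    double prev = 0.0;
    for (int n = 0; n < 2000; n++) {
        double x = (n % 50 < 25) ? 1.0 : -1.0;   /* square-wave far end */
        double d = 0.5 * prev;                   /* simulated echo path */
        double e = echo_cancel(x, d);
        if (n % 500 == 0) printf("n=%d residual=%.4f\n", n, e);
        prev = x;
    }
    return 0;
}
```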

Multimedia information consoles are now the predominant feature of new-car dashboards. Safety concerns are driving demand for in-car hand gesture and voice recognition technology as a potential answer to driver distraction, an issue that will become more conspicuous as connectivity becomes a default feature of most new cars. The potential for both technologies in human-machine interface development is immense, and IC manufacturers are quickly integrating the necessary functionality into MCUs and sensors to make it a reality.
