
The technology builds acoustic sensors into vehicles, aiming to improve safety and reliability on the road.
Cars can see the road, but now they’re learning to hear it.
A new wave of acoustic technology could give vehicles the missing sense that cameras and radar can’t provide.
By detecting sirens before they’re visible, picking up the chatter of pedestrians, or even transmitting urgent sounds through a driver’s headrest, researchers are teaching cars to react to the world the way humans do: by listening.
“Being able to perceive exterior sounds and attribute them accurately is a crucial part of attentively observing the full traffic environment. After all, many situations on the road are preceded by an acoustic signal. Take an approaching emergency vehicle, for example, which alerts people to its presence by using a siren,” said Moritz Brandes, who leads The Hearing Car project at Fraunhofer IDMT.
Unlike optical systems, which need a clear line of sight, acoustic sensors can pick up what’s happening around corners or in crowded streets. That ability could prove essential for autonomous driving, where every millisecond of awareness matters.
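To make that idea concrete, here is a minimal sketch of the general technique behind acoustic localization: two microphones hear the same sound at slightly different times, and that tiny delay reveals the direction it came from. This is an illustration of the principle, not Fraunhofer IDMT's implementation; the function name, microphone spacing, and sign convention are invented for the example.

```python
# Illustrative sketch (not the project's actual code): estimating the
# direction a sound arrives from with two microphones. A siren reaching
# the left microphone slightly before the right one implies it is off to
# one side, even when the source itself is hidden around a corner.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C


def direction_of_arrival(left, right, mic_spacing, sample_rate):
    """Estimate the bearing of a sound source from a stereo microphone pair.

    left, right: 1-D arrays of synchronized audio samples.
    mic_spacing: distance between the microphones in meters.
    Returns an angle in degrees (0 = straight ahead; the sign convention
    depends on channel ordering).
    """
    # Cross-correlate the two channels to find the lag (in samples)
    # at which they line up best.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    delay = lag / sample_rate  # time difference of arrival, in seconds

    # Convert the delay into an angle; clip to [-1, 1] to guard against
    # delays slightly beyond the physically possible maximum.
    sin_theta = np.clip(delay * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))
```

Production systems use larger microphone arrays and more robust correlation methods, but the principle is the same: timing differences reveal direction, with no line of sight required.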
Hearing what vision misses
The demo vehicle, dubbed The Hearing Car, is packed with microphones and AI software that can recognize and classify sounds from the road.
These sensors are designed to stand up to rain, wind, and extreme temperatures, with careful placement ensuring accurate pickup even at highway speeds. Testing has taken the car from Portugal to the Arctic Circle to stress the technology in real conditions.
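The project has not published its classifier, but the underlying task, picking a siren out of road noise, can be illustrated with a toy heuristic: track the dominant frequency over time and look for the rising-and-falling sweep characteristic of a wail. The band limits and thresholds below are illustrative guesses, not tuned values from the project.

```python
# Toy stand-in for the car's AI sound classifier: flag siren-like audio
# by tracking the dominant tone over time and checking whether it sweeps
# the way a wailing siren does. Thresholds are illustrative only.
import numpy as np
from scipy.signal import spectrogram


def looks_like_siren(samples, sample_rate,
                     band=(500.0, 1800.0), min_sweep_hz=150.0):
    """Return True if the dominant tone sweeps like a siren.

    samples: 1-D array of audio; band: frequency range most sirens occupy;
    min_sweep_hz: how far the dominant tone must move to count as a wail.
    """
    freqs, times, power = spectrogram(samples, fs=sample_rate,
                                      nperseg=1024, noverlap=512)
    # Restrict attention to the band where sirens live.
    mask = (freqs >= band[0]) & (freqs <= band[1])
    dominant = freqs[mask][np.argmax(power[mask], axis=0)]
    # A wailing siren's pitch rises and falls; broadband road noise doesn't.
    sweep = dominant.max() - dominant.min()
    return sweep >= min_sweep_hz
```

A deployed system would replace this hand-built rule with a trained neural network, which can also tell apart horns, voices, and other sound classes the article describes.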
To make sure drivers don’t miss critical cues, important noises can also be piped directly into the cabin via the headrest. That means a siren, horn, or warning call is not just detected but delivered right to the driver’s ear, helping them respond faster.
The project involves close collaboration with automotive suppliers and manufacturers, who see the potential of acoustic sensing as the next big leap in driver assistance.
Cars that listen back
The same technology that helps vehicles recognize a siren also enables more natural interaction with their passengers. Drivers can issue voice commands like “Open the trunk,” while speaker verification ensures only authorized voices can trigger key actions.
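Speaker verification of this kind typically works by comparing a voiceprint, an embedding of the speaker's voice, against an enrolled reference. The sketch below assumes a hypothetical embed() model and an illustrative similarity threshold; it is a generic pattern, not the project's actual code.

```python
# Minimal sketch of a speaker-verification gate. embed() is a hypothetical
# stand-in for any speaker-embedding model; the project's real model and
# threshold are not public.
import numpy as np


def embed(audio: np.ndarray) -> np.ndarray:
    """Hypothetical placeholder: map an utterance to a fixed-size voiceprint."""
    raise NotImplementedError("plug in a real speaker-embedding model here")


def is_authorized(utterance, enrolled_voiceprint, threshold=0.75):
    """Accept a voice command only if the speaker matches the enrolled owner.

    Compares cosine similarity between the utterance's embedding and the
    stored voiceprint; 0.75 is an illustrative cutoff, not a tuned value.
    """
    e = embed(utterance)
    cos = np.dot(e, enrolled_voiceprint) / (
        np.linalg.norm(e) * np.linalg.norm(enrolled_voiceprint))
    return cos >= threshold


# A command like "Open the trunk" would then run only when
# is_authorized(mic_capture, owner_voiceprint) returns True.
```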
Inside the cabin, researchers are layering on tools to monitor drivers’ health and attention. Short-range radar can measure heart rate and breathing without contact, while mobile EEG headbands track brain activity for signs of fatigue. Voice analysis detects stress or excitement, feeding real-time feedback to occupants.
For passengers, the YourSound system personalizes in-car entertainment, letting people fine-tune audio to their tastes without needing technical knowledge. Acting like a virtual sound assistant, it creates a more comfortable listening environment for everyone.
By combining exterior acoustic sensing with interior monitoring, The Hearing Car shows how vehicles can evolve beyond cameras and radar to become truly attentive machines.
The research is backed by the Ministry for Science and Culture of Lower Saxony, the Volkswagen Foundation, and Germany’s Federal Ministry of Research, Technology and Space under its KI4BoardNet initiative.
The prototype will be showcased from September 9–12 at the IAA MOBILITY 2025 show in Munich.