
Duke's quadruped robot. Source: Duke University
Ever since robots were invented, they have gathered information about the world primarily through vision, leaving them largely deprived of touch, hearing, and smell. A framework to help them gain these senses is now being developed.
Researchers from Duke University have developed a framework that enables robots to perceive objects outdoors much as humans do. Named WildFusion, the framework lets robots sense vibrations and touch alongside their visual capabilities, and move accordingly.
What is WildFusion?
“WildFusion opens a new chapter in robotic navigation and 3D mapping,” said Boyuan Chen, an Assistant Professor at Duke University. “It helps robots to operate more confidently in unstructured, unpredictable environments like forests, disaster zones and off-road terrain.”
WildFusion has been accepted to the IEEE International Conference on Robotics and Automation (ICRA 2025), which will be held May 19-23, 2025, in Atlanta, Georgia.
Yanbaihui Liu, lead student author and a second-year PhD student, shed light on how WildFusion removes robots' sole dependence on visual capabilities.
“Typical robots rely heavily on vision or LiDAR alone, which often falter without clear paths or predictable landmarks. Even advanced 3D mapping methods struggle to reconstruct a continuous map when sensor data is sparse, noisy or incomplete, which is a frequent problem in unstructured outdoor environments. That’s exactly the challenge WildFusion was designed to solve,” she said.
What does WildFusion consist of?
Built on a quadruped robot, the WildFusion system includes an RGB camera, LiDAR, inertial sensors, contact microphones, and tactile sensors. As in most robots, the camera and LiDAR do much of the work, capturing the environment's geometry, distance, color, and other visual details. What differentiates WildFusion is its added use of acoustic vibrations and touch.
The contact microphones record the distinct vibrations generated as the robot walks, capturing subtle differences in sound from one surface to the next. The tactile sensors, meanwhile, measure how much force is applied to each foot, and the system also assesses how much the robot wobbles or rolls when operating on uneven terrain.
Specialized encoders then process each type of sensory data, merging it into a single picture in the robot's mind. WildFusion is further powered by a deep learning model built on implicit neural representations.
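To illustrate how this kind of multimodal fusion can work, below is a minimal, hypothetical PyTorch sketch in which each sensor stream gets its own small encoder and the outputs are merged into one shared feature vector. The module names, input sizes, and architecture are illustrative assumptions, not the actual WildFusion implementation.

```python
# Hypothetical multimodal fusion sketch (assumes PyTorch); NOT the actual
# WildFusion code. Each sensor stream gets its own small encoder, and the
# resulting features are concatenated and projected into one shared vector.
import torch
import torch.nn as nn

class MultimodalEncoder(nn.Module):
    def __init__(self, fused_dim=256):
        super().__init__()
        # Illustrative per-modality encoders; all sizes are assumptions.
        self.rgb_enc = nn.Sequential(                      # RGB camera frames
            nn.Conv2d(3, 16, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 64))
        self.lidar_enc = nn.Sequential(                    # flattened LiDAR points
            nn.Linear(1024 * 3, 128), nn.ReLU(), nn.Linear(128, 64))
        self.audio_enc = nn.Sequential(                    # contact-microphone features
            nn.Linear(128, 64), nn.ReLU())
        self.tactile_enc = nn.Sequential(                  # per-foot force readings
            nn.Linear(12, 64), nn.ReLU())
        # Project the concatenated modality features into one representation.
        self.fuse = nn.Linear(64 * 4, fused_dim)

    def forward(self, rgb, lidar, audio, tactile):
        feats = torch.cat([
            self.rgb_enc(rgb),                 # (B, 64)
            self.lidar_enc(lidar.flatten(1)),  # (B, 64)
            self.audio_enc(audio),             # (B, 64)
            self.tactile_enc(tactile),         # (B, 64)
        ], dim=-1)
        return self.fuse(feats)                # (B, fused_dim)
```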
Instead of viewing the environment as separate data points, this method captures surfaces and features as continuous structures. This enables the robot to make more intelligent and instinctive choices about where to move—even when its line of sight is unclear or obstructed.
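For a sense of what an implicit neural representation looks like, here is a similarly hypothetical sketch: a small MLP that maps any continuous 3D query point, conditioned on the fused sensor feature, to a signed distance to the nearest surface and a traversability score. Again, the architecture and names are assumptions for illustration, not WildFusion's published model.

```python
# Hypothetical implicit-neural-representation sketch (assumes PyTorch); NOT
# WildFusion's published model. A small MLP maps any continuous 3D query
# point, conditioned on the fused sensor feature, to a signed distance to
# the nearest surface and a traversability score, so the environment is a
# continuous function rather than a set of discrete data points.
import torch
import torch.nn as nn

class ImplicitSceneModel(nn.Module):
    def __init__(self, fused_dim=256, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 + fused_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # [signed distance, traversability logit]
        )

    def forward(self, xyz, fused_feat):
        # xyz: (N, 3) query points; fused_feat: (fused_dim,) scene feature.
        feat = fused_feat.expand(xyz.shape[0], -1)
        out = self.mlp(torch.cat([xyz, feat], dim=-1))
        return out[:, 0], torch.sigmoid(out[:, 1])  # sdf, traversability

# Example query with placeholder inputs: score candidate footholds ahead of
# the robot at arbitrary (continuous) coordinates.
model = ImplicitSceneModel()
points = torch.rand(5, 3)          # candidate footholds in the robot frame
scene_feature = torch.rand(256)    # e.g., output of the encoder sketched above
sdf, traversability = model(points, scene_feature)
```

Because the map is a function that can be queried at arbitrary coordinates, the robot can reason about the gaps between sparse or noisy sensor readings instead of treating them as holes in a grid.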
Has it been tested?
WildFusion was tested at the Eno River State Park in North Carolina near Duke’s campus, successfully helping a robot navigate dense forests, grasslands and gravel paths. “Watching the robot confidently navigate terrain was incredibly rewarding,” Liu said. “These real-world tests proved WildFusion’s remarkable ability to accurately predict traversability, significantly improving the robot’s decision-making on safe paths through challenging terrain.”
While the potential applications of such technology are vast, including search and rescue operations and infrastructure inspection, it also raises important considerations about the ethical deployment of autonomous systems in sensitive environments. Ensuring that these technologies are developed and used responsibly will be crucial as they become more integrated into various aspects of society.