For years, the sense of touch in digital interactions has largely been limited to basic vibrations. While our eyes and ears are treated to high-fidelity visuals and immersive soundscapes through advanced technologies, haptic feedback has remained comparatively rudimentary. However, engineers at Northwestern University are poised to change this, having unveiled a groundbreaking new technology designed to simulate the complex and nuanced sensations of human touch. Their work, detailed in a study published in the journal Science, introduces a wearable device capable of creating precise movements that mimic the intricate ways our skin interacts with the physical world.
How the Technology Works
Yonggang Huang, PhD, a professor of Mechanical Engineering and of Civil and Environmental Engineering at McCormick who co-led the theoretical work, highlighted the technical achievement of balancing size and force. “Achieving both a compact design and strong force output is crucial,” Huang said. He added that their team “developed computational and analytical models to identify optimal designs, ensuring each mode generates its maximum force component while minimizing unwanted forces or torques,” underscoring the precision involved in the actuator’s design. The study builds upon previous research from the labs of John A. Rogers, PhD, and Huang, including work on programmable arrays of miniature vibrating actuators.
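The article does not publish the team’s computational models, but the design goal Huang describes, maximizing each mode’s intended force while penalizing unwanted forces and torques, can be caricatured as a simple scoring function. Everything below, including the candidate fields and penalty weight, is a hypothetical illustration, not the study’s method:

```python
def design_score(candidate):
    """Score a candidate actuator design: reward the intended force
    component, penalize parasitic (unwanted) forces and torques.
    The keys and the penalty weight are illustrative assumptions."""
    penalty = 1.0  # weight on unwanted force/torque components
    parasitic = sum(abs(f) for f in candidate["parasitic_forces"])
    parasitic += sum(abs(t) for t in candidate["parasitic_torques"])
    return candidate["intended_force"] - penalty * parasitic

# Choosing among hypothetical candidate designs
candidates = [
    {"intended_force": 1.2, "parasitic_forces": [0.3, 0.1], "parasitic_torques": [0.2]},
    {"intended_force": 1.0, "parasitic_forces": [0.05], "parasitic_torques": [0.02]},
]
best = max(candidates, key=design_score)
```

Note that the second design wins despite its weaker intended force, because its parasitic components are much smaller; that trade-off is the essence of the quoted design goal.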
Potential Applications Across Industries
The authors envision wide-ranging applications for the new haptic technology. They foresee the device eventually being used to make virtual experiences feel more physically interactive. For individuals with visual impairments, it could provide tactile feedback to aid in navigating their surroundings. In online retail, it could reproduce the feeling of different textures on flat screens, allowing consumers to “feel” fabrics or materials before purchasing. The device could also provide crucial tactile feedback for remote healthcare visits, potentially allowing remote palpation or examination. Intriguingly, the technology could even enable people with hearing impairments to “feel” music, translating auditory experiences into physical sensations.
The Complexity of Human Touch
The reason haptic technology has lagged behind visual and auditory advancements stems largely from the extraordinary complexity inherent in the human sense of touch. This sense relies on different types of mechanoreceptors, each possessing unique sensitivities and response characteristics, located at varying depths within the skin. When these sensors are stimulated, they transmit signals to the brain, where they are interpreted as touch. Replicating this level of sophistication and nuance digitally requires incredibly precise control over the type, magnitude, and timing of stimuli delivered to the skin—a challenge that existing technologies have struggled to overcome.
J. Edward Colgate, PhD, a haptics pioneer and study co-author, elaborated on this difficulty. “Part of the reason haptic technology lags video and audio in its richness and realism is that the mechanics of skin deformation are complicated,” said Colgate, the Walter P. Murphy Professor of Mechanical Engineering at McCormick. He explained that “skin can be poked in or stretched sideways. Skin stretching can happen slowly or quickly, and it can happen in complex patterns across a full surface, such as the full palm of the hand,” illustrating the multifaceted nature of tactile sensation.
The Actuator’s Breakthrough: Full Freedom of Motion
To effectively simulate this complexity, the Northwestern team developed what they describe as the first actuator with full freedom of motion (FOM). This means the actuator is not limited to a single type or a restricted set of movements. Instead, it possesses the capability to move and apply forces in all directions across the skin surface. This dynamic application of force is key to engaging all the different types of mechanoreceptors in the skin, both individually and in combination with one another, thereby enabling the reproduction of a much wider range of tactile sensations.
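One way to picture what full freedom of motion means is to split an arbitrary contact force into the two kinds of skin deformation Colgate describes: a normal “poke” into the skin and a tangential “stretch” across it. The sketch below is a minimal geometric illustration, assuming the skin patch lies in the x-y plane; it is not the device’s control code:

```python
import math

def decompose_contact_force(fx, fy, fz):
    """Split a desired skin-contact force into a normal 'poke'
    component and a tangential 'stretch' component, assuming the
    skin surface is the x-y plane (z is the surface normal)."""
    poke = fz                          # force into (or away from) the skin
    stretch = math.hypot(fx, fy)       # magnitude of sideways stretch
    stretch_dir = math.atan2(fy, fx)   # stretch direction in the skin plane (radians)
    return poke, stretch, stretch_dir

# A force angled 45 degrees into the skin along the x axis:
poke, stretch, direction = decompose_contact_force(1.0, 0.0, 1.0)
```

A conventional vibration motor fixes most of these quantities; an FOM actuator, as described, can vary all of them independently, which is what lets it target different mechanoreceptors alone or in combination.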
“It’s a big step toward managing the complexity of the sense of touch,” Colgate stated, emphasizing the significance of the FOM actuator as “the first small, compact haptic device that can poke or stretch skin, operate slow or fast, and be used in arrays,” making it capable of producing a remarkable range of tactile sensations.
Bringing the Virtual World to Life
Beyond replicating everyday tactile experiences, the platform also demonstrated the potential to transfer information through the skin using haptic feedback. By altering the frequency, intensity, and rhythm of the haptic sensations, the team successfully converted the sound of music into physical touch. They found they could even differentiate between various musical instruments simply by changing the direction of the vibrations felt by the users.
“We were able to break down all the characteristics of music and map them into haptic sensations without losing the subtle information associated with specific instruments,” Rogers said. He concluded by emphasizing the broader impact of their work, stating, “It’s just one example of how the sense of touch could be used to complement another sensory experience. We think our system could help further close the gap between the digital and physical worlds. By adding a true sense of touch, digital interactions can feel more natural and engaging.”