When it comes to haptic feedback, most technologies are limited to simple vibrations. But our skin is loaded with tiny sensors that detect pressure, vibration, stretching and more.
Now, Northwestern University engineers have unveiled a new technology that creates precise movements to mimic these complex sensations.
Resting on the skin, the compact, lightweight, wireless device applies force in any direction to generate a variety of sensations, including vibrations, stretching, pressure, sliding and twisting. The device, detailed in a study published in the journal Science, also can combine sensations and operate fast or slow to simulate a more nuanced, realistic sense of touch.
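One way to picture "combining sensations" is as a superposition of force components over time. The sketch below is purely illustrative and is not the study's actual control scheme: it assumes a hypothetical actuator that accepts a tangential (shear) force for sliding or stretching plus a normal-axis vibration, and samples the combined waveform at a given instant.

```python
import math

def force_waveform(t, vib_hz=200.0, vib_amp=0.2, shear=0.5):
    """Hypothetical combined haptic waveform, sampled at time t (seconds).

    Superposes a steady tangential (shear) force, which would feel like
    sliding or stretching, with a high-frequency normal vibration.
    Returns (fx, fz) in arbitrary force units. All parameter names and
    values are illustrative assumptions, not figures from the study.
    """
    fx = shear                                          # constant sliding/stretch component
    fz = vib_amp * math.sin(2 * math.pi * vib_hz * t)   # vibration component
    return fx, fz
```

Driving the same actuator with a slower envelope, or zeroing one component, would isolate a single sensation; mixing them yields the richer, layered touch the article describes.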
Powered by a small rechargeable battery, the device connects wirelessly over Bluetooth to virtual reality headsets and smartphones. It also is small and efficient, so it could be placed anywhere on the body, combined with other actuators in arrays or integrated into current wearable electronics.
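To make the wireless, programmable control concrete, here is a minimal sketch of how a host app might pack a force command for transmission over Bluetooth. The packet layout (three signed 16-bit force components plus a 16-bit duration) is entirely hypothetical and invented for illustration; the study does not describe its protocol.

```python
import struct

def encode_force_command(fx, fy, fz, duration_ms):
    """Pack a hypothetical actuator command into 8 bytes:
    three signed 16-bit force components (arbitrary units) and an
    unsigned 16-bit duration in milliseconds, little-endian.
    This format is an assumption for illustration only."""
    return struct.pack("<hhhH", int(fx), int(fy), int(fz), duration_ms)

def decode_force_command(packet):
    """Unpack the hypothetical 8-byte command back into its fields."""
    return struct.unpack("<hhhH", packet)

# A brief "sliding" sensation might be a purely tangential force:
pkt = encode_force_command(150, 0, 0, 50)
```

In a real system, bytes like these would be written to a Bluetooth characteristic exposed by the device; here the point is only that each sensation reduces to a small, programmable command.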
The researchers envision their device eventually could enhance virtual experiences, help individuals with visual impairments navigate their surroundings, reproduce the feeling of different textures on flat screens for online shopping, provide tactile feedback for remote health care visits and even enable people with hearing impairments to “feel” music.
“Almost all haptic actuators really just poke at the skin,” said Northwestern’s John A. Rogers, who led the device design. “But skin is receptive to much more sophisticated senses of touch. We wanted to create a device that could apply forces in any direction — not just poking but pushing, twisting and sliding. We built a tiny actuator that can push the skin in any direction and in any combination of directions. With it, we can finely control the complex sensation of touch in a fully programmable way.”
A pioneer in bioelectronics, Rogers is the Louis A. Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering, and Neurological Surgery, with appointments in the McCormick School of Engineering and Northwestern University Feinberg School of Medicine. He also directs the Querrey Simpson Institute for Bioelectronics. Rogers co-led the work with Northwestern’s Yonggang Huang, the Jan and Marcia Achenbach Professor in Mechanical Engineering and professor of civil and environmental engineering at McCormick. Northwestern’s Kyoung-Ho Ha, Jaeyoung Yoo and Shupeng Li are the study’s co-first authors.
The study builds on previous work from Rogers’ and Huang’s labs, in which they designed a programmable array of miniature vibrating actuators to convey a sense of touch.
The haptic hang-up
In recent years, visual and auditory technologies have experienced explosive growth, delivering unprecedented immersion through devices like high-fidelity surround-sound speakers and fully immersive virtual-reality goggles. Haptics technologies, however, mostly have plateaued. Even state-of-the-art systems offer only buzzing patterns of vibrations.
This developmental gap stems largely from the extraordinary complexity of human touch. The sense of touch involves different types of mechanoreceptors (or sensors) — each with its own sensitivity and response characteristics — located at varying depths within the skin. When these mechanoreceptors are stimulated, they send signals to the brain, which the brain interprets as touch.