Facebook researchers have come up with a wearable device that can teach people to feel words on their arms. The Facebook wearable takes cues from Braille and Tadoma, a communication method for people who are both deaf and blind. The prototype has been designed with actuators that, when triggered, cause vibrations on the arm.
The prototype is connected to a computer that lets an operator select different phonemes and sample words; the wearer then feels vibrations produced by the various actuators on the top and bottom of the arm. Patterns of these vibrations correspond to individual sounds, which combine to make up a language, allowing the wearer to “read” incoming smartphone messages.
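To make the encoding idea concrete, here is a minimal sketch of how phonemes might be mapped to actuator vibration patterns. The actuator indices, durations, and phoneme set are illustrative assumptions, not the actual encoding used by Facebook's device:

```python
# Hypothetical phoneme-to-vibration mapping (illustrative only, not
# Facebook's actual encoding). Each phoneme maps to a tuple of actuator
# indices to fire together, plus a vibration duration in milliseconds.
PHONEME_PATTERNS = {
    "b": ((0, 1), 150),   # actuators 0 and 1, top of forearm (assumed)
    "a": ((2, 5), 200),   # actuators 2 and 5, underside (assumed)
    "t": ((3,), 100),     # single actuator near the wrist (assumed)
}

def encode_word(phonemes):
    """Translate a phoneme sequence into sequential actuator activations."""
    return [PHONEME_PATTERNS[p] for p in phonemes]

# The word "bat" becomes three vibration patterns played in order:
print(encode_word(["b", "a", "t"]))
```

In a real device, a driver loop would play these patterns back-to-back on the hardware; the wearer learns to recognise each pattern as a sound, much as a Braille reader learns letter shapes.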
During the study, researchers were able to teach people to feel four different phonemes in three minutes. “Over more than an hour and a half of training, study participants were able to learn to recognise 100 words,” said Ali Israr, the technical lead for the project. The device is currently limited to just four to ten words per minute, but researchers are trying to speed up how quickly it can transmit words to the arm. On the adoption of the language, MIT Technology Review reported that people have so far been able to learn about 500 words after 100 minutes, with 90 percent accuracy.
Could this idea lead to a different kind of smartwatch? Facebook researchers are confident that their device is a game-changing wearable that will help people with hearing and vision impairments get information more efficiently. The team will present its work later this month at the annual CHI conference on human-computer interaction in Montreal, Canada, and we are excited to see how this wearable tech evolves over time.