Wearable sensors to translate sign language
American Sign Language (ASL) evolved as a natural language for people who cannot hear. For them, it is difficult to be unable to connect with their surroundings, as hardly anyone who speaks and hears knows sign language.
This communication gap can make life frustrating for those who cannot hold a conversation with their own family members. It is much like being stuck among foreigners who cannot understand your language, and whose language you cannot comprehend either, even though both sides can speak and hear well.
To make life easier for deaf communities and their loved ones, and to help them feel at home rather than like foreigners in their own world, a team at the University of California, Los Angeles, designed a wearable high-tech glove that detects hand movements, i.e., the signs of ASL, and transcribes them for those who do not understand sign language.
The glove-like device is equipped with thin, stretchable, electrically conducting sensors running the length of each finger, which detect the hand movements made by the wearer. The movements are converted into electrical signals and relayed to a coin-sized circuit board on the wrist. From there, the signals are sent wirelessly to a connected smartphone, which translates the movements into spoken words.
The translation runs at a rate of about one word per second.
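To make that data path concrete, here is a minimal Python sketch of the sensor-to-speech pipeline the article describes. Everything in it is hypothetical (the simulated readings, the tiny gesture vocabulary, the bent/straight threshold); it only mirrors the stages reported: sample the finger sensors, classify the pattern into a word, and voice it at roughly one word per second.

```python
# Minimal sketch of the glove-to-speech pipeline described above.
# All names, thresholds, and the tiny lookup "model" are hypothetical;
# the UCLA team's actual signal processing and vocabulary are not shown here.

import random
import time

# Hypothetical vocabulary: each entry maps a coarse per-finger pattern
# ("bent"/"straight" for thumb through pinky) to a word.
GESTURE_VOCAB = {
    ("bent", "straight", "straight", "straight", "straight"): "hello",
    ("bent", "bent", "straight", "straight", "straight"): "thanks",
    ("straight", "straight", "bent", "bent", "bent"): "yes",
}


def read_finger_sensors():
    """Stand-in for the stretchable finger sensors: returns one
    simulated reading (0.0-1.0) per finger instead of real hardware data."""
    return [random.random() for _ in range(5)]


def classify_gesture(readings, threshold=0.5):
    """Reduce raw readings to a bent/straight pattern and look it up
    in the hypothetical vocabulary; returns None for unknown patterns."""
    pattern = tuple("bent" if r > threshold else "straight" for r in readings)
    return GESTURE_VOCAB.get(pattern)


def speak(word):
    """Placeholder for the smartphone's text-to-speech step."""
    print(f"[speech] {word}")


if __name__ == "__main__":
    # The article reports roughly one word per second, so the loop
    # samples, classifies, and "speaks" at that rate.
    for _ in range(5):
        word = classify_gesture(read_finger_sensors())
        if word is not None:
            speak(word)
        time.sleep(1.0)
```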
Language is not only about words; it's also about expressions, isn't it? Who, in the world of text messaging, would deny this, given how many misunderstandings text messages caused until emoticons came along?
So the gloves can provide the words, but what about the expressions?
People who use American Sign Language (ASL) do not communicate with hand movements alone; facial expressions form part of the grammar of the language. With this in mind, the UCLA researchers added adhesive sensors that are applied to the user's face, between the eyebrows and on one side of the mouth, to capture facial expressions.
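As a rough illustration of how such facial cues could modify the translated output, here is a small Python sketch. The expression labels and the rules (raised eyebrows marking a yes/no question, a hypothetical mouth cue marking negation) are illustrative assumptions, not the team's published method.

```python
# Illustrative only: combine a signed word with a facial-expression cue.
# The cue names and rules below are assumptions for the sake of example.

def combine(word: str, expression: str) -> str:
    """Attach simple 'grammar' carried by the face to the signed word."""
    if expression == "eyebrows_raised":
        return f"{word}?"        # raised eyebrows as a question marker
    if expression == "mouth_corner_down":
        return f"not {word}"     # hypothetical negation cue
    return word                  # neutral face: word passes through unchanged


if __name__ == "__main__":
    print(combine("finish", "eyebrows_raised"))        # -> "finish?"
    print(combine("understand", "mouth_corner_down"))  # -> "not understand"
```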
The wearable glove is not only inexpensive but also lightweight, in contrast to previous devices that offered ASL translation.
Jun Chen, an assistant professor of bioengineering at the UCLA Samueli School of Engineering and the principal investigator on the research, said:
“Our hope is that this opens up an easy way for people who use sign language to communicate directly with non-signers without needing someone else to translate for them. In addition, we hope it can help more people learn sign language themselves.”
Chen added that UCLA has filed for a patent on the technology, and that a commercial model based on it would require added vocabulary and an even faster translation time.
Source: "Wearable-tech glove translates sign language into speech in real time," UCLA Newsroom, https://newsroom.ucla.edu/releases/glove-translates-sign-language-to-speech