THIS GLOVE CAN TRANSLATE SIGN LANGUAGE INTO SPOKEN WORD IN REAL TIME

A team of researchers at the University of California, Los Angeles, has developed a wearable device that translates American Sign Language into speech using a smartphone app. The device, a glove, provides the translation in real time so wearers can communicate more easily with an audience. Although it is still at the prototype stage, experts say it could help people who rely on sign language communicate more effectively with non-signers, and it could also help people learning sign language.

“Our system offers good mechanical and chemical durability, high sensitivity, quick response, and excellent stretchability,” said Jun Chen, assistant professor of bioengineering at the UCLA Samueli School of Engineering.

The gloves contain thin, stretchable sensors made of electrically conductive yarn that run along the length of all five fingers. The sensors relay the wearer’s finger movements to a coin-sized circuit board worn on the wrist, which in turn transmits the data to a connected smartphone. Because American Sign Language relies on facial expressions as well as hand movements, the system also includes sensors adhered to the user’s eyebrows and the sides of the mouth. A machine learning algorithm interprets the signals and can recognize 660 signs, including every letter of the alphabet and the numbers 1 through 9.
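To make that data path more concrete, here is a minimal, purely illustrative sketch of how per-finger stretch readings could be matched to a known sign. None of this is the UCLA team’s code: the finger profiles, sign labels, and the simple nearest-neighbor matching are all assumptions standing in for the paper’s actual machine learning model.

```python
# Hypothetical sketch: classify a glove reading (one normalized stretch value
# per finger) by finding the closest stored reference profile.
from math import dist  # Euclidean distance, Python 3.8+

# Illustrative calibration profiles for a few signs
# (thumb, index, middle, ring, pinky), values are made up.
REFERENCE_SIGNS = {
    "A": [0.1, 0.9, 0.9, 0.9, 0.9],   # fist with thumb alongside
    "B": [0.8, 0.1, 0.1, 0.1, 0.1],   # flat hand, thumb folded across palm
    "5": [0.1, 0.1, 0.1, 0.1, 0.1],   # open hand, all fingers extended
}

def classify(sample: list[float]) -> str:
    """Return the reference sign whose finger profile is closest to the sample."""
    return min(REFERENCE_SIGNS, key=lambda sign: dist(sample, REFERENCE_SIGNS[sign]))

if __name__ == "__main__":
    # A reading as it might arrive from the wrist-worn circuit board.
    reading = [0.15, 0.85, 0.92, 0.88, 0.90]
    print(classify(reading))  # -> "A"
```

A real system would work on time series rather than single snapshots and would be trained on many signers, but the sketch shows the basic idea of mapping sensor signals to a vocabulary of signs.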

Chen noted that previous sign language translation devices have been based on a wide range of techniques, including electromyography, the piezoresistive effect, ionic conduction, the capacitive effect, and photography with image processing. But the inherent complexity of these tools, along with how cumbersome they are, has kept them little more than proof-of-concept lab experiments.

“For example, vision-based sign language translation systems have high requirements for optimal lighting,” Chen said. “If the available lighting is poor, this compromises the visual quality of the signing motion captured by the camera and consequently affects the recognition results. Alternatively, sign language translation systems based on surface electromyography have strict requirements for the position of the worn sensors, which can impact translation accuracy and reliability.”

The researchers envision their device being practical in real-world settings. In addition to being unaffected by external factors such as lighting, the UCLA device could be produced inexpensively.

“We are still working to polish the system. It may take three to five years to be commercialized,” Chen said.

A paper describing the work has been published in the journal Nature Electronics.
