A group of engineering students at Carnegie Mellon University — Bhargav Bhat, Hemant Sikaria, Jorge L. Meza and Wesley Jin — demonstrated their project 'HandTalk', a sensor-equipped glove that translates finger and hand gestures into spoken words. The project got off the ground to enable communication between deaf people and people who do not know sign language. This first demonstrator model shows the functionality with a limited vocabulary of 32 words, which is not bad for a version number of v0.1.
The HandTalk works like this: sensors in the glove pick up gestures and transmit the data wirelessly via Bluetooth to a cell phone running text-to-speech software. The sensor data are converted first into text and then into voice output. A person who does not know sign language can hear, via the cell phone, what the other person is saying in sign language.
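The pipeline described above — sensor readings classified into a small vocabulary, then handed to a text-to-speech step — can be sketched roughly as follows. This is an illustrative sketch, not the students' actual code: the sensor count, the gesture templates, and the calibration values are all invented, and a simple nearest-neighbor match stands in for whatever classification the real glove uses.

```python
# Illustrative sketch (assumptions, not the HandTalk implementation):
# classify a glove's flex-sensor readings against a small gesture
# vocabulary by nearest neighbor, then pass the matched word to a
# text-to-speech step (stubbed here as print).

import math

# Hypothetical calibration: each word maps to expected readings from
# five finger flex sensors (0.0 = straight, 1.0 = fully bent).
GESTURE_TEMPLATES = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1],
    "thank you": [0.9, 0.1, 0.1, 0.1, 0.9],
    "yes":       [0.9, 0.9, 0.9, 0.9, 0.9],
    "no":        [0.1, 0.9, 0.9, 0.1, 0.1],
}

def classify(readings):
    """Return the vocabulary word whose template is closest to the readings."""
    return min(GESTURE_TEMPLATES,
               key=lambda word: math.dist(readings, GESTURE_TEMPLATES[word]))

def speak(word):
    # On the phone this would feed a text-to-speech engine; print stands in.
    print(word)

if __name__ == "__main__":
    sample = [0.85, 0.92, 0.88, 0.95, 0.9]  # noisy "all fingers bent" reading
    speak(classify(sample))
```

With 32 words the real vocabulary is larger than this toy table, but the shape of the problem is the same: a fixed set of gesture templates and a cheap distance check per reading.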
I like the simplicity and the (very) cheap components these students used to create this amazing and truly interactive glove, which could go a long way toward breaking down the communication barrier between deaf people and those unfamiliar with sign language.
A brilliant idea. Check out the excellently organized project documentation, which is available online. A highly interesting and inspirational example of a wearable interactive glove.