Microsoft has developed a new smartphone app that interprets eye movements and translates them into letters, allowing people with motor neurone disease to communicate with others using a phone.
The GazeSpeak app combines a smartphone's camera with artificial intelligence to recognize eye movements in real time and convert them into letters, words and sentences.
For people suffering from ALS, also known as motor neurone disease, eye movement can be the only way they are able to communicate.
“Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups,” said Xiaoyi Zhang, a researcher at Microsoft who developed the technology.
“To mitigate the drawbacks…we created GazeSpeak, an eye-gesture communication system that runs on a smartphone, and is designed to be low-cost, robust, portable and easy to learn.”
The listener uses the app by pointing their smartphone at the speaker. A chart that can be stuck to the back of the smartphone then shows the speaker which eye movements to make in order to communicate.
The sticker shows four grids of letters, each corresponding to a different eye movement. By looking up, down, left or right, the speaker selects the grid containing each letter they want, and the artificial intelligence algorithm then predicts the word or sentence they are trying to say.
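To make the idea concrete, here is a minimal sketch of that grid-based disambiguation scheme in Python. The four-way split of the alphabet, the gesture names and the simple dictionary lookup are assumptions for illustration only; the article does not describe GazeSpeak's actual letter layout or prediction model, which would likely use a statistical language model rather than an exact match.

```python
# Hypothetical grouping: one grid of letters per eye gesture.
# (Illustrative only; not the real GazeSpeak layout.)
GRIDS = {
    "up":    set("abcdef"),
    "down":  set("ghijkl"),
    "left":  set("mnopqr"),
    "right": set("stuvwxyz"),
}

# Reverse lookup: which gesture selects the grid containing each letter.
LETTER_TO_GESTURE = {
    letter: gesture for gesture, letters in GRIDS.items() for letter in letters
}


def gestures_for(word: str) -> list[str]:
    """Gesture sequence a speaker would make to spell `word`, one gesture per letter."""
    return [LETTER_TO_GESTURE[ch] for ch in word.lower()]


def predict(gestures: list[str], vocabulary: list[str]) -> list[str]:
    """Return dictionary words whose spelling matches the gesture sequence.

    Each gesture only narrows a letter down to one of four grids, so several
    words can share a sequence; a real system would rank candidates by word
    frequency or a language model.
    """
    return [w for w in vocabulary if gestures_for(w) == gestures]


if __name__ == "__main__":
    vocab = ["hello", "help", "water", "yes", "no"]
    seq = gestures_for("help")      # ['down', 'up', 'down', 'left']
    print(seq)
    print(predict(seq, vocab))      # candidate words for that gesture sequence
```

Because each gesture maps to a whole group of letters rather than a single one, the speaker never has to fixate on a tiny on-screen target, which is what lets the system work with a plain phone camera instead of a dedicated eye tracker.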