Posted on February 17, 2017 by New Scientist - News

Microsoft app helps people with ALS speak using just their eyes

A smartphone app called GazeSpeak uses eye movements to predict the words you want to say, allowing people with motor disabilities to communicate faster.