A new iPhone app from Microsoft may help people with ALS communicate more quickly. The system, known as GazeSpeak, uses the iPhone camera to capture eye movements and translates them into words using computer vision. According to preliminary results, the approach helps people with ALS communicate more than 40% faster than existing e-tran boards, with accuracy approaching 90%.
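To illustrate the idea, here is a minimal sketch of gaze-gesture text entry, assuming an e-tran-style scheme in which the alphabet is split into four groups, one per gaze direction, and ambiguous gesture sequences are narrowed down against a dictionary. The group layout and function names are illustrative assumptions, not taken from the GazeSpeak app itself:

```python
# Illustrative four-group layout: one gaze direction per letter group.
# (Hypothetical grouping; GazeSpeak's actual layout may differ.)
GROUPS = {
    "up":    "ABCDEF",
    "right": "GHIJKL",
    "down":  "MNOPQR",
    "left":  "STUVWXYZ",
}

# Invert the table: each letter maps to the gaze direction of its group.
LETTER_TO_DIR = {ch: d for d, letters in GROUPS.items() for ch in letters}

def candidates(gestures, dictionary):
    """Return dictionary words consistent with a sequence of gaze directions."""
    return [
        w for w in dictionary
        if len(w) == len(gestures)
        and all(LETTER_TO_DIR[ch] == g for ch, g in zip(w.upper(), gestures))
    ]

words = ["CAT", "ACT", "DOG"]
# "C" and "A" are both in the "up" group and "T" is in "left", so one
# gesture per letter leaves two candidates that prediction must rank.
print(candidates(["up", "up", "left"], words))    # ['CAT', 'ACT']
print(candidates(["up", "down", "right"], words))  # ['DOG']
```

Because one gesture per letter is ambiguous, a real system would rank the surviving candidates with a language model, which is where computer-vision speed gains over manual e-tran spelling would come from.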
The iPhone-based system is more portable and less expensive than existing infrared eye-tracking devices such as the Tobii Dynavox. According to Microsoft, the software also requires no recalibration under similar lighting conditions.
The iPhone app will be presented by the Microsoft Enable team at the ACM Conference on Human Factors in Computing Systems (CHI) in Denver, Colorado, in May 2017. The source code will be made freely available so that developers can improve its performance and/or adapt the app for other smart devices.
The iPhone app, according to Microsoft's Meredith Ringel Morris, will be available for download at no charge from the Apple App Store in early May 2017.
Zhang, X., Kulkarni, H., Morris, M.R. Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities. Conference on Human Factors in Computing Systems (CHI) 2017. Published online, January 2017.