Communicating On The Go? There May Be an Eye-tracking App For That.


iTracker for PALS? A new iPhone app may soon enable people with ALS to communicate on the go. The eye-tracking approach uses a sticker on the back of the listener’s iPhone that displays four groups of letters, in place of an e-tran board. [Image: Zhang et al., 2017; YouTube.]

A new iPhone app from Microsoft may help people with ALS communicate more quickly. The strategy, known as GazeSpeak, uses the iPhone camera to capture eye movements and, using computer vision techniques, translates them into words. According to preliminary results, the approach helps people with ALS communicate more than 40% faster than existing e-tran boards and achieves nearly 90% accuracy.
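To make the group-based scheme concrete, here is a minimal Python sketch of how a sequence of gaze gestures over four letter groups could be decoded into candidate words, T9-style. The specific letter grouping, direction names, and word list below are illustrative assumptions, not Microsoft's implementation; GazeSpeak's actual decoder is part of the source code the team plans to release.

```python
# Illustrative sketch of group-based gaze decoding (not Microsoft's implementation).
# Assumes the e-tran-style sticker splits the alphabet into four fixed groups and
# that each gaze gesture (up/right/down/left) selects the group containing the
# next letter; candidate words are then recovered T9-style from a word list.

from collections import defaultdict

# Hypothetical four-way grouping of the alphabet; the real sticker layout may differ.
GROUPS = {
    "up":    set("abcdef"),
    "right": set("ghijklm"),
    "down":  set("nopqrs"),
    "left":  set("tuvwxyz"),
}

def letter_to_gesture(letter):
    """Map a letter to the gaze direction that selects its group."""
    for direction, letters in GROUPS.items():
        if letter in letters:
            return direction
    raise ValueError(f"unsupported character: {letter!r}")

def build_index(vocabulary):
    """Index each word by its gesture sequence so lookups are constant time."""
    index = defaultdict(list)
    for word in vocabulary:
        key = tuple(letter_to_gesture(c) for c in word.lower())
        index[key].append(word)
    return index

def decode(gestures, index):
    """Return candidate words matching the observed gesture sequence."""
    return index.get(tuple(gestures), [])

if __name__ == "__main__":
    vocab = ["task", "wash", "want", "tell", "yes", "no"]
    index = build_index(vocab)
    # The speaker glances left, up, down, right -- one gesture per letter of "task".
    print(decode(["left", "up", "down", "right"], index))  # -> ['task', 'wash']
```

Because several words can share the same gesture sequence (as with "task" and "wash" above), a practical decoder would also need to rank candidates, for example by word frequency or a language model, before presenting them to the listener.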

The iPhone-based approach is more portable and less expensive than existing infrared-based eye-tracking devices such as the Tobii Dynavox. And, according to Microsoft, the software requires no recalibration under similar lighting conditions.

The iPhone app will be presented by the Microsoft Enable team at the ACM Conference on Human Factors in Computing Systems (CHI) in Denver, Colorado in May 2017. The source code will be made freely available so that developers can improve its performance and/or adapt the app for other smart devices.

The iPhone app, according to Microsoft’s Meredith Ringel Morris, will be available for free download from the Apple App Store in early May 2017.

Reference:

Zhang, X., Kulkarni, H., Morris, M.R. Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities. ACM Conference on Human Factors in Computing Systems (CHI) 2017. Published online, January 2017.
