Brain–Computer Interface Allows Communication With Locked-In Patients

Interview with:
Dr. Ujwal Chaudhary, PhD
Institute of Medical Psychology and Behavioral Neurobiology
University of Tübingen
Tübingen, Germany

What is the background for this study?

Response: Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disorder that first leaves an individual in the locked-in state (LIS), in which patients retain control only of vertical eye movements and blinking, and ultimately in the completely locked-in state (CLIS), in which they have no control even over their eye muscles. Several augmentative and alternative communication (AAC) technologies, along with EEG-based BCIs, can be used by patients in LIS for communication, but once they are in CLIS they have no means of communication. Hence, there was a need to find an alternative learning paradigm, and probably another neuroimaging technique, to design a more effective BCI to help ALS patients in CLIS communicate.

Therefore, based on our past experience, we decided to design an fNIRS-based auditory BCI for communication with CLIS patients. Patients were presented with an auditory paradigm in which each session consisted of 20 known questions formulated from the patients' lives: 10 requiring a “yes” answer and 10 requiring a “no”. While the patients thought “yes” or “no”, the change in oxy-hemoglobin concentration during this thinking period was extracted and used to train a support vector machine (SVM) model. Once an SVM model with a classification accuracy of at least 70% was achieved, patients were given feedback on their answers; that is, at the end of the thinking period the computer said, “Your answer was yes” or “Your answer was no”. After successful feedback sessions with known answers, patients were auditorily presented with open questions, i.e., questions that could only be answered by the patients (for example, “Are you in pain?”, or any other question the family members wished to ask), and again they thought “yes” or “no” and the computer said “Your answer was yes/no” depending on what they were thinking (Chaudhary et al., 2017). What are the main findings?
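The classification step described above — oxy-hemoglobin features, an SVM classifier, and a 70% accuracy threshold before feedback is given — can be sketched in a minimal, illustrative form. All data, channel counts, and parameters below are synthetic assumptions, not the authors' actual pipeline:

```python
# Illustrative sketch only: synthetic oxy-Hb change features stand in for
# real fNIRS recordings; the actual study's preprocessing is not shown.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 20, 8  # e.g. 20 questions per session, 8 fNIRS channels (assumed)

# Synthetic oxy-Hb concentration changes: "yes" trials shifted slightly upward
X_yes = rng.normal(loc=0.5, scale=1.0, size=(n_trials // 2, n_features))
X_no = rng.normal(loc=-0.5, scale=1.0, size=(n_trials // 2, n_features))
X = np.vstack([X_yes, X_no])
y = np.array([1] * (n_trials // 2) + [0] * (n_trials // 2))  # 1 = "yes", 0 = "no"

# Train an SVM and estimate accuracy by cross-validation
clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=5)
mean_acc = scores.mean()

# Auditory feedback ("Your answer was yes/no") would be enabled only once
# the model reaches the 70% accuracy threshold described in the interview
feedback_enabled = mean_acc >= 0.70
```

In the study itself this threshold gated the transition from training questions with known answers to open questions answerable only by the patient.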

Response: The most significant finding was the ability of the fNIRS-based auditory BCI to discriminate between the “yes” and “no” thinking of CLIS patients with 70–75% classification accuracy and to provide reliable answers to the open questions presented to the patients. Patients also reported being happy, which is in agreement with other quality-of-life studies in this population, where patients are reported to have a good quality of life (Lulé et al., 2013). What should readers take away from your report?

Response: Life is worth living when patients have a positive social environment. What recommendations do you have for future research as a result of this study?

Response: There is a need to design a simpler, better, and more cost-effective communication system that can be used by patients who have no other means of communication. Is there anything else you would like to add?

Response: As of now this system provides only binary “yes”/“no” communication. This is a work in progress, and we are implementing several improvements to make the system more user-friendly and versatile. Thank you for your contribution to the community.


Ujwal Chaudhary, Bin Xia, Stefano Silvoni, Leonardo G. Cohen, Niels Birbaumer. Brain–Computer Interface–Based Communication in the Completely Locked-In State. PLOS Biology, 2017; 15 (1): e1002593 DOI: 10.1371/journal.pbio.1002593

Note: Content is not intended as medical advice. Please consult your health care provider regarding your specific medical condition and questions.



Last Updated on February 12, 2017 by Marie Benz MD FAAD