Decoding neural responses to sounds in patients with disorders of consciousness for diagnostics and unlocking communication
A cornerstone in the diagnosis of patients with disorders of consciousness is the detection of command following, which distinguishes patients in a vegetative state (i.e., awake and showing reflex movements, but unconscious) from patients in a minimally conscious state (i.e., showing inconsistent but reproducible first signs of recovery of consciousness). Clinical assessment of these patients relies on the clinician’s ability to detect a behavioral response to an instruction (e.g., “squeeze my hand”). However, recent studies have shown that some of these patients can produce volitional brain responses to command even when clinicians can detect no behavioral response, highlighting the importance of developing motor-independent diagnostic tools for this population. While promising, the first attempts showed high misdiagnosis rates. This could be explained by the limitations of the proposed methods with respect to the clinical reality of these patients (e.g., fluctuation of the patient’s vigilance over time, or the ability to understand instructions). In this proposal, we aim to develop a novel, objective, and motor-independent diagnostic tool based on brain-computer interface technology, one that integrates information about the patient’s ability to hear, perceive, and understand instructions, and that self-adapts to fluctuations in the patient’s state (e.g., vigilance, motivation, and alertness).