
Publication

Improved Motion Classification with an Integrated Multimodal Exoskeleton Interface

Journal Contribution - Journal Article

Human motion intention detection is an essential part of the control of upper-body exoskeletons. While surface electromyography (sEMG)-based systems may be able to provide anticipatory control, they typically require exact placement of the electrodes on the muscle bellies, which limits the practical use and donning of the technology. In this study, we propose a novel physical interface for exoskeletons with integrated sEMG and pressure sensors. The sensors are 3D-printed with flexible, conductive materials and allow multimodal information to be obtained during operation. A K-Nearest Neighbours classifier is implemented offline to detect reaching movements and lifting tasks that represent daily activities of industrial workers. The performance of the classifier is validated through repeated experiments and compared to that of a unimodal sEMG-based classifier. The results indicate that excellent prediction performance can be obtained, even with a minimal number of sEMG electrodes and without specific placement of the electrodes.
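To illustrate the kind of offline analysis the abstract describes, the following is a minimal sketch (not the authors' code) of a K-Nearest Neighbours classifier trained on multimodal sEMG plus pressure features and compared against a unimodal sEMG-only baseline. The feature layout, channel counts, class labels, and data are hypothetical placeholders (random values), and scikit-learn is assumed as the classification library.

```python
# Hypothetical sketch: offline KNN classification of motion classes from
# multimodal (sEMG + pressure) features vs. a unimodal sEMG-only baseline.
# All data below are random placeholders, not experimental results.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic windowed features: 4 sEMG channels + 2 pressure channels,
# one row per time window; labels = 3 assumed motion classes (e.g. reach, lift, rest).
n_windows, n_emg, n_pressure = 300, 4, 2
X_emg = rng.normal(size=(n_windows, n_emg))
X_pressure = rng.normal(size=(n_windows, n_pressure))
y = rng.integers(0, 3, size=n_windows)

def knn_accuracy(X: np.ndarray, y: np.ndarray, k: int = 5) -> float:
    """Mean 5-fold cross-validated accuracy of a standardized KNN classifier."""
    model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
    return cross_val_score(model, X, y, cv=5).mean()

acc_unimodal = knn_accuracy(X_emg, y)                             # sEMG features only
acc_multimodal = knn_accuracy(np.hstack([X_emg, X_pressure]), y)  # sEMG + pressure features

print(f"unimodal sEMG accuracy:  {acc_unimodal:.2f}")
print(f"multimodal accuracy:     {acc_multimodal:.2f}")
```

With real windowed sensor features in place of the random arrays, the same comparison would show whether adding the pressure channels improves classification over sEMG alone.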
Journal: Frontiers in Neurorobotics
ISSN: 1662-5218
Volume: 15
Publication year: 2021
Keywords: human-machine interface, classification, exoskeletons, machine learning, intention recognition, EMG, wearable sensor
BOF-keylabel: yes
IOF-keylabel: yes
BOF-publication weight: 1
Authors: Regional
Authors from: Government, Higher Education
Accessibility: Open