
BCI-controlled exoskeleton: neurorehabilitation for patients with impaired lower limb function

At the international symposium ExoRehab Spotlights 2018, held on December 5 in Moscow, researchers of the Center for Bioelectric Interfaces Nikolai Smetanin, Aleksandra Kuznetsova and Alexei Ossadtchi presented Russia's first EEG-based neural interface that uses lower limb motor imagery for exoskeleton control. Alexei Ossadtchi also gave a presentation, "BCI for walk decoding". This work was a collaboration with the Russian company ExoAtlet.

Megagrant #14.641.31.0003 "Bi-directional ECoG BCIs for control, stimulation and communication". Lead scientist: Mikhail Lebedev.

Link to video on NTV: https://www.ntv.ru/novosti/2117481/

Exoskeletons have found use in the clinical rehabilitation of patients with impaired lower limb function. However, controlling an exoskeleton conventionally requires some physical action, such as a button press, which is far from how natural motor actions are initiated. It has been hypothesized that lower limb movement triggered by activation of the appropriate motor areas of the cortex may accelerate rehabilitation by promoting neural plasticity.
While several studies have demonstrated the efficacy of BCI-assisted therapy for the upper limbs, similar research on patients with lower limb impairments is lacking. Developing a lower limb motor imagery-based BCI to drive an exoskeleton has been challenging because the EEG signal is contaminated with artifacts from the exoskeleton electronics, intense body movements and tonic muscle activity.
We have designed a closed loop comprising an exoskeleton and a BCI, in which each locomotor act is initiated only when the patient succeeds in vividly imagining moving their legs.
The video demonstrates the neural interface in action. The top left part of the screen shows two cues for the patient: to idle (red sign) or to imagine moving their legs (green arrow, accompanied by a sound cue). The bottom left displays the decoding results: red for idling (the exoskeleton does not move) and green for movement (the exoskeleton makes a step).
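The closed loop described above, where a step is triggered only when motor imagery is decoded from the EEG, can be illustrated with a minimal sketch. This is not the authors' actual decoder; it assumes a single EEG channel, a hypothetical sampling rate, and a simple threshold on mu-band (8-13 Hz) power, whose suppression during motor imagery (event-related desynchronization) is a common BCI feature. All names, parameters and thresholds here are illustrative assumptions.

```python
import numpy as np

FS = 250           # assumed EEG sampling rate, Hz (illustrative)
MU_BAND = (8, 13)  # mu rhythm band; its power drops during motor imagery

def mu_band_power(window: np.ndarray) -> float:
    """Mean spectral power in the mu band for one single-channel EEG window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
    mask = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return float(spectrum[mask].mean())

def decode_state(window: np.ndarray, idle_power: float) -> str:
    """Return 'move' if mu power drops well below the resting baseline
    (event-related desynchronization), else 'idle'. The 0.5 factor is an
    arbitrary illustrative threshold, not a value from the study."""
    return "move" if mu_band_power(window) < 0.5 * idle_power else "idle"

def control_step(window: np.ndarray, idle_power: float, exoskeleton_step) -> bool:
    """One iteration of the closed loop: the exoskeleton steps only when
    motor imagery is decoded; otherwise it stays still."""
    if decode_state(window, idle_power) == "move":
        exoskeleton_step()  # hypothetical callback into the exoskeleton API
        return True
    return False
```

In a real system the decoder would use multi-channel features, calibrated classifiers and artifact suppression, but the control logic reduces to this gate: no decoded imagery, no step.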

This work was funded by Megagrant #14.641.31.0003 of the Government of the Russian Federation to support research conducted in Russian institutions of higher education, research organizations and state research centers under the Subprogram of Institutional Development of the Research Sector (State Program "Developing Science and Technology" in 2013-2020).