
Successful ECoG decoding in real time!

Today marks the successful completion of the first year of our research project funded by Megagrant #14.641.31.0003 "Bi-directional ECoG BCIs for control, stimulation and communication" (lead scientist: Mikhail Lebedev).

Junior research fellow Ksenia Volkova and research assistant Maria Kondratova, co-supervised by Mikhail Lebedev and Alexei Ossadtchi, successfully completed a series of experiments decoding hand trajectory from ECoG in real time. The researchers were the first to use convolutional neural networks to decode raw ECoG data without pre-processing and feature extraction. Achieving this required designing an optimal convolutional network architecture and a unique method of teaching the patient to control an avatar hand. Arthur Petrosyan and Alexander Belyayev, associates of the Center, also contributed to the algorithm development. Our results show that finger movement decoding accuracy does not depend on the position of the upper limbs. This paves the way for robust and natural control of prosthetic upper limbs with many degrees of freedom, based on decoding local field potentials rather than single-neuron spiking data. A minimal sketch of the general idea, a convolutional network regressing hand position directly from windows of raw ECoG, is given below.
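The sketch below illustrates the general approach described above: a small 1D convolutional network that maps a window of raw multichannel ECoG samples directly to hand coordinates, with no spectral feature extraction. It is not the architecture used in the study; the channel count, window length, layer sizes and output dimensionality are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

# Illustrative values only -- not the parameters used in the study.
N_CHANNELS = 32   # assumed number of ECoG electrodes
WINDOW = 256      # assumed number of raw samples per decoding window
N_OUTPUTS = 3     # assumed hand coordinates (x, y, z)

class ECoGTrajectoryDecoder(nn.Module):
    """1D convolutional network mapping a window of raw ECoG to a hand position."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolutions applied directly to the raw signal,
            # i.e. no hand-crafted spectral features.
            nn.Conv1d(N_CHANNELS, 64, kernel_size=9, stride=2, padding=4),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=9, stride=2, padding=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.head = nn.Linear(64, N_OUTPUTS)

    def forward(self, x):  # x: (batch, channels, samples)
        return self.head(self.features(x).squeeze(-1))

# One real-time decoding step: a window of raw ECoG in, a hand position out.
decoder = ECoGTrajectoryDecoder()
window = torch.randn(1, N_CHANNELS, WINDOW)  # placeholder for streamed data
position = decoder(window)                   # shape: (1, N_OUTPUTS)
```

In a real-time setting such a decoder would be applied to a sliding window of the incoming signal, producing an updated position estimate for the avatar hand at each step.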


This work was funded by Megagrant #14.641.31.0003 of the Government of the Russian Federation to support research conducted in Russian institutions of higher education, research organizations and state research centers under the Subprogram of Institutional Development of the Research Sector (State Program "Developing Science and Technology" in 2013-2020).