
Deep learning-based BCI for gait decoding from EEG with LSTM recurrent neural network

Tortora S.; Ghidoni S.; Micera S.
2020

Abstract

Objective. Mobile Brain/Body Imaging (MoBI) frameworks have allowed the research community to find evidence of cortical involvement at walking initiation and during locomotion. However, decoding gait patterns from brain signals remains an open challenge. The aim of this work is to propose and validate a deep learning model to decode gait phases from electroencephalography (EEG). Approach. A Long Short-Term Memory (LSTM) deep neural network has been trained to deal with the time-dependent information within brain signals during locomotion. The EEG signals were preprocessed by means of Artifact Subspace Reconstruction (ASR) and Reliable Independent Component Analysis (RELICA) to ensure that classification performance was not affected by movement-related artifacts. Main results. The network was evaluated on a dataset of 11 healthy subjects walking on a treadmill. The proposed decoding approach shows a robust reconstruction (AUC > 90%) of gait patterns (i.e. swing and stance states) of both legs together, or of each leg independently. Significance. Our results support for the first time the use of a memory-based deep learning classifier to decode walking activity from non-invasive brain recordings. We suggest that this classifier, exploited in real time, can provide a more effective input for devices restoring locomotion in impaired people.
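The approach described in the abstract (a recurrent network mapping windows of preprocessed EEG to swing/stance labels for each leg) can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the channel count, hidden size, window length, and two-output head (one swing probability per leg) are assumptions chosen only to make the example self-contained.

```python
# Minimal sketch of an LSTM-based gait-phase decoder in PyTorch.
# All hyperparameters below (32 channels, 64 hidden units, 100-sample
# windows) are illustrative assumptions, not the values from the paper.
import torch
import torch.nn as nn

class GaitPhaseLSTM(nn.Module):
    """Maps a window of preprocessed EEG to swing/stance probabilities per leg."""
    def __init__(self, n_channels=32, hidden_size=64, n_outputs=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, n_outputs)

    def forward(self, x):
        # x: (batch, time, channels); the last hidden state summarizes the
        # window before the per-leg classification head.
        _, (h_n, _) = self.lstm(x)
        return torch.sigmoid(self.head(h_n[-1]))

# Example: a batch of 8 windows, 100 time samples, 32 EEG channels.
x = torch.randn(8, 100, 32)
probs = GaitPhaseLSTM()(x)  # shape (8, 2): P(swing) for left and right leg
```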
Files in this record:
File: ghidoni_deep_bci.pdf (Adobe PDF, 3.31 MB), not publicly available
Type: Published (publisher's version)
License: Private access (not public)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3351210
Citations
  • PMC: 8
  • Scopus: 73
  • Web of Science (ISI): 66
  • OpenAlex: ND