Hybrid EEG-based BCI user interface for action selection

J. Pascual*, R. Lorenz, B. Blankertz, C. Vidaurre

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

1 Citation (Scopus)

Abstract

In this study we developed a Graphical User Interface (GUI) to control a Brain-Computer Interface (BCI) by means of Event-Related Potentials (ERP) and/or Motor Imagery (MI). It allows users to select actions to operate an upper-limb neuroprosthesis. Action selection is divided into two steps, choice and confirmation, each of which can be controlled using ERP or MI. We also present results from experiments with 12 participants who used this GUI, showing that high performance is achieved with all possible combinations of paradigms. The GUI mode in which both the selection and confirmation steps use the ERP paradigm obtains the highest accuracy.
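The two-step flow the abstract describes (a choice step followed by a confirmation step, each drivable by either the ERP or the MI paradigm) can be sketched as a small control loop. This is a hypothetical illustration only: the function and action names below are invented for clarity and do not reflect the chapter's actual implementation.

```python
# Hypothetical sketch of two-step action selection: choose, then confirm.
# Either step could be backed by an ERP classifier or an MI classifier;
# here both are passed in as interchangeable callables (assumed names).

from typing import Callable, List, Optional

# Example neuroprosthesis actions (illustrative, not from the paper)
ACTIONS = ["grasp", "release", "rotate", "rest"]

def select_action(
    choose: Callable[[List[str]], str],   # choice-step decoder (e.g. ERP or MI)
    confirm: Callable[[str], bool],       # confirmation-step decoder (e.g. ERP or MI)
    actions: List[str] = ACTIONS,
    max_attempts: int = 3,
) -> Optional[str]:
    """Run choice -> confirmation; retry the choice if it is rejected."""
    for _ in range(max_attempts):
        candidate = choose(actions)       # user picks one candidate action
        if confirm(candidate):            # user accepts or rejects the pick
            return candidate              # confirmed: send to neuroprosthesis
    return None                           # no confirmed selection

# Usage with stub decoders standing in for real EEG classifiers:
picked = select_action(choose=lambda a: a[0], confirm=lambda c: True)
print(picked)  # -> grasp
```

Decoupling the two steps behind the same interface is what lets all four paradigm combinations (ERP/ERP, ERP/MI, MI/ERP, MI/MI) be compared, as the experiments in the chapter do.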

Original language: English
Title of host publication: Biosystems and Biorobotics
Publisher: Springer International Publishing
Pages: 1171-1175
Number of pages: 5
DOIs
Publication status: Published - 2013
Externally published: Yes

Publication series

Name: Biosystems and Biorobotics
Volume: 1
ISSN (Print): 2195-3562
ISSN (Electronic): 2195-3570
