Robot control based on visual-stimulus evoked potentials using the Emotiv EPOC device
Abstract
This thesis describes a brain–computer interface (BCI) based on visual-stimulus evoked potentials, implemented as the practical part of this work. A BCI can be used to create a direct communication channel between the brain and a device, meaning that no buttons need to be pressed to interact with the device; looking at visual stimuli is enough. An effective BCI would, for example, allow severely disabled people to control an electric wheelchair. The BCI implemented as part of this thesis uses the well-known canonical correlation analysis and power spectrum analysis methods and, as a novelty, combines these two methods into a single complementary one. Combining the two methods makes the BCI more accurate. The BCI was tested only superficially in this work, with the following results: 2.61 s to issue one command, 85.81% accuracy and an information transfer rate of 27.73 bits/min. The BCI is open-source, written in the Python 2.7 programming language, includes a graphical user interface and uses the Emotiv EPOC electroencephalography (EEG) device to measure brain activity. Using the BCI requires only a computer and an Emotiv EPOC. Other EEG devices can be used by modifying the code.
This thesis describes an SSVEP-based BCI implemented as the practical part of this work. One possible use of a BCI that efficiently implements a communication channel between the brain and an external device would be to help severely disabled people control devices that currently require pushing buttons, such as an electric wheelchair. The BCI implemented as part of this thesis uses the widely known PSDA and CCA feature-extraction methods and introduces a new way to combine them. Combining different methods improves the performance of a BCI. The application was tested only superficially, and the following results were obtained: 2.61 s target detection time, 85.81% accuracy and an ITR of 27.73 bits/min. The implemented BCI is open-source, written in Python 2.7, has a graphical user interface and uses the inexpensive Emotiv EPOC EEG device. The BCI requires only a computer and an Emotiv EPOC; no additional hardware is needed. Other EEG devices could be used after modifying the code.
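To illustrate the kind of CCA+PSDA combination the abstract refers to, the sketch below shows a minimal SSVEP target detector: CCA correlates an EEG window against sine/cosine reference signals for each stimulus frequency, PSDA measures spectral power at each frequency, and the two scores are fused by normalised summation. This is not the thesis's implementation; the function names, the fusion rule (normalised sum) and the parameters are illustrative assumptions, and only the 128 Hz Emotiv EPOC sampling rate used in the example is a known device property.

```python
import numpy as np

def cca_max_corr(X, Y):
    # Largest canonical correlation between data matrices X (n x p) and Y (n x q),
    # computed as the top singular value of Qx^T Qy (QR-based CCA).
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def reference_signals(freq, fs, n_samples, n_harmonics=2):
    # Sine/cosine reference set at the stimulus frequency and its harmonics.
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)

def psda_score(signal, freq, fs):
    # Power at the stimulus frequency relative to mean spectral power (SNR-like).
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    idx = np.argmin(np.abs(freqs - freq))
    return spectrum[idx] / spectrum.mean()

def detect_target(eeg, stim_freqs, fs):
    # eeg: (n_samples, n_channels). Fuse normalised CCA and PSDA scores
    # (one possible combination rule, assumed here for illustration).
    n = eeg.shape[0]
    cca = np.array([cca_max_corr(eeg, reference_signals(f, fs, n))
                    for f in stim_freqs])
    psda = np.array([psda_score(eeg.mean(axis=1), f, fs)
                     for f in stim_freqs])
    score = cca / cca.sum() + psda / psda.sum()
    return stim_freqs[int(np.argmax(score))]
```

In a real BCI the fused score would additionally be thresholded so that no command is issued when no stimulus is being attended, which is one way such a combination can trade detection time against accuracy.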