A novel transformer for attention decoding using EEG

Electroencephalography (EEG) attention classification plays a crucial role in brain-computer interface (BCI) applications. This paper introduces EEG-PatchFormer, a novel deep learning model leveraging transformers to achieve superior EEG attention decoding. We posit that transformers’ strength in capturing long-range temporal dependencies, coupled with their recent success on spatial data, makes them ideally suited for processing EEG signals. We begin by outlining a pilot study investigating the impact of various patching strategies on the classification accuracy of a transformer-based network. This study revealed significant performance variations across patching methods, emphasising the importance of optimal patching for model efficacy. We then present the proposed EEG-PatchFormer architecture. Key modules include a temporal convolutional neural network (CNN), a pointwise convolutional layer, and separate patching modules that handle global spatial, local spatial, and temporal features. The model then feeds these patches into a transformer module and culminates in a fully-connected classifier. Finally, EEG-PatchFormer’s performance across various evaluation experiments is discussed. Extensive evaluation on a publicly available cognitive attention dataset demonstrated that EEG-PatchFormer surpasses existing state-of-the-art benchmarks in mean classification accuracy, area under the ROC curve (AUC), and macro-F1 score. Hyperparameter tuning and ablation studies were carried out to further optimise the model and to understand the contribution of its individual components. Overall, this project establishes EEG-PatchFormer as a state-of-the-art model for EEG attention decoding, with promising applications for BCI.
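The patching idea at the heart of the model — splitting a multi-channel EEG segment into patch tokens and passing them through self-attention — can be sketched in plain NumPy. This is an illustration only: the patch length, token layout, and single-head attention here are assumptions made for the sketch, not the paper's actual design, which additionally includes a temporal CNN, a pointwise convolution, and separate global/local spatial patching modules.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_patches(eeg, patch_len):
    """Split a (channels, time) EEG segment into non-overlapping
    temporal patches, each flattened into one token vector."""
    c, t = eeg.shape
    n = t // patch_len
    # (n_patches, channels * patch_len): one token per temporal patch
    return (eeg[:, :n * patch_len]
            .reshape(c, n, patch_len)
            .transpose(1, 0, 2)
            .reshape(n, -1))

def self_attention(x):
    """Single-head scaled dot-product self-attention over patch tokens."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                      # (n, n) similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                 # softmax over keys
    return w @ x                                       # attended tokens

# Toy 8-channel, 128-sample EEG segment (random stand-in for real data)
eeg = rng.standard_normal((8, 128))
tokens = make_patches(eeg, patch_len=16)               # 8 tokens, 128-dim each
attended = self_attention(tokens)
pooled = attended.mean(axis=0)                         # pool before a classifier head
print(tokens.shape, attended.shape, pooled.shape)
```

The pilot study's finding — that classification accuracy varies significantly with the patching strategy — corresponds here to the choice of `patch_len` and of the axis along which tokens are formed (temporal, as above, versus spatial groupings of channels).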

Bibliographic Details
Main Author: Lee, Joon Hei
Other Authors: Guan Cuntai (CTGuan@ntu.edu.sg)
School: School of Computer Science and Engineering
Format: Final Year Project (FYP)
Degree: Bachelor's degree
Project Code: SCSE23-0162
Language: English
Published: Nanyang Technological University, 2024
Subjects: Computer and Information Science; EEG; Deep learning; Attention; Brain-computer interface
Online Access: https://hdl.handle.net/10356/175057
Citation: Lee, J. H. (2024). A novel transformer for attention decoding using EEG. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175057