Research Paper · Researchia: 202602.24040 · Bio-AI Interfaces > Neuroscience

EEG-Driven Intention Decoding: Offline Deep Learning Benchmarking on a Robotic Rover

Ghadah Alosaimi

Abstract

Brain-computer interfaces (BCIs) provide a hands-free control modality for mobile robotics, yet decoding user intent during real-world navigation remains challenging. This work presents a brain-robot control framework for offline decoding of driving commands during robotic rover operation. A 4WD Rover Pro platform was remotely operated by 12 participants who navigated a predefined route using a joystick, executing the commands forward, reverse, left, right, and stop. Electroencephalogram (EEG) signals were recorded with a 16-channel OpenBCI cap and aligned with motor actions at Δ = 0 ms and future prediction horizons (Δ > 0 ms). After preprocessing, several deep learning models were benchmarked, including convolutional neural networks, recurrent neural networks, and Transformer architectures. ShallowConvNet achieved the highest performance for both action prediction and intent prediction. By combining real-world robotic control with multi-horizon EEG intention decoding, this study introduces a reproducible benchmark and reveals key design insights for predictive deep learning-based BCI systems.
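The alignment scheme described above (windows at Δ = 0 ms for action decoding, Δ > 0 ms for intent prediction) can be sketched as follows. This is a minimal illustration, not the paper's code: the function name `extract_epochs`, the 1 s window length, and the 125 Hz sampling rate (a common OpenBCI 16-channel rate) are all assumptions for demonstration.

```python
import numpy as np

def extract_epochs(eeg, fs, event_samples, win_ms=1000, delta_ms=0):
    """Slice fixed-length EEG windows aligned to command onsets.

    For delta_ms = 0 the window ends at the command onset (action
    decoding); for delta_ms > 0 the window ends delta_ms *before*
    the onset, so a model trained on it must predict the upcoming
    command (intent prediction at horizon delta_ms).

    eeg:            (n_channels, n_samples) array
    event_samples:  command-onset indices in samples
    """
    win = int(win_ms * fs / 1000)      # window length in samples
    delta = int(delta_ms * fs / 1000)  # prediction horizon in samples
    epochs = []
    for onset in event_samples:
        end = onset - delta
        start = end - win
        if start >= 0 and end <= eeg.shape[1]:  # skip out-of-bounds epochs
            epochs.append(eeg[:, start:end])
    return np.stack(epochs) if epochs else np.empty((0, eeg.shape[0], win))

# Example with synthetic 16-channel data (values are illustrative only)
rng = np.random.default_rng(0)
eeg = rng.standard_normal((16, 125 * 60))   # one minute of fake EEG at 125 Hz
events = [1250, 2500, 5000]                 # fake command onsets (samples)
X_action = extract_epochs(eeg, 125, events, win_ms=1000, delta_ms=0)
X_intent = extract_epochs(eeg, 125, events, win_ms=1000, delta_ms=200)
print(X_action.shape, X_intent.shape)       # (3, 16, 125) (3, 16, 125)
```

The resulting `(epochs, channels, samples)` arrays are the standard input shape for ShallowConvNet-style decoders; sweeping `delta_ms` over several values yields the multi-horizon benchmark the abstract describes.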


Source: arXiv:2602.20041v1 - http://arxiv.org/abs/2602.20041v1
PDF: https://arxiv.org/pdf/2602.20041v1

Submission: 2/24/2026
Subjects: Neuroscience; Bio-AI Interfaces
