Paper ID: 2409.00035
EEG Right & Left Voluntary Hand Movement-based Virtual Brain-Computer Interfacing Keyboard with Machine Learning and a Hybrid Bi-Directional LSTM-GRU Model
Biplov Paneru, Bishwash Paneru, Sanjog Chhetri Sapkota
This study focuses on an EEG-based brain-machine interface (BMI) for detecting voluntary keystrokes, aiming to develop a reliable brain-computer interface (BCI) to simulate and anticipate keystrokes, especially for individuals with motor impairments. The methodology includes extensive segmentation, event alignment, event-related potential (ERP) plot analysis, and signal analysis. Machine learning and deep learning models are trained to classify EEG data into three categories: 'resting state' (0), 'd' key press (1), and 'l' key press (2). Real-time keypress simulation based on neural activity is enabled through integration with a Tkinter-based graphical user interface. Feature engineering utilized ERP windows, and the SVC model achieved 90.42% accuracy in event classification. Additionally, several models -- MLP (89% accuracy), CatBoost (87.39%), KNN (72.59%), Gaussian Naive Bayes (79.21%), Logistic Regression (90.81%), and a novel Bi-Directional LSTM-GRU hybrid model (89%) -- were developed for BCI keyboard simulation. Finally, a GUI was created to predict and simulate keystrokes using the trained MLP model.
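The abstract describes a three-class EEG classifier built around a hybrid bidirectional LSTM-GRU network. The paper itself does not provide code here, so the following is a minimal sketch of such an architecture in Keras, assuming ERP-windowed EEG input of shape (timesteps, channels); layer sizes, dropout rate, and the placeholder data are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch (not the authors' code): a hybrid bidirectional LSTM-GRU
# classifier for 3-class EEG windows: 0 = resting state, 1 = 'd' press, 2 = 'l' press.
import numpy as np
from tensorflow.keras import layers, models

def build_bilstm_gru(n_timesteps, n_channels, n_classes=3):
    # Stack a bidirectional LSTM (returning sequences) followed by a
    # bidirectional GRU, then a small dense head with softmax output.
    model = models.Sequential([
        layers.Input(shape=(n_timesteps, n_channels)),
        layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
        layers.Bidirectional(layers.GRU(32)),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),  # illustrative regularization choice
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Placeholder data standing in for ERP-windowed EEG epochs
    # (n_epochs, n_timesteps, n_channels) with integer labels in {0, 1, 2}.
    X = np.random.randn(256, 128, 8).astype("float32")
    y = np.random.randint(0, 3, size=256)
    model = build_bilstm_gru(n_timesteps=128, n_channels=8)
    model.fit(X, y, epochs=2, batch_size=32, validation_split=0.2)
```

In a setup like the one described, the predicted class from such a model (or from the trained MLP used in the paper's GUI) could be mapped to a simulated 'd' or 'l' keypress inside a Tkinter event loop; the exact GUI wiring is specific to the authors' implementation and is not reproduced here.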
Submitted: Aug 18, 2024