.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "generated/auto_examples/core/tutorial_eoec.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code, or to run
        this example in your browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_generated_auto_examples_core_tutorial_eoec.py:

.. _tutorial-eoec:

Eyes Open vs. Closed Classification
===================================

An EEGDash example of eyes-open vs. eyes-closed classification.

This example uses the :mod:`eegdash` library in combination with PyTorch to
develop a deep learning model for analyzing EEG data, specifically for
eyes-open vs. eyes-closed classification in a single subject.

1. **Data Retrieval Using EEGDash**: An instance of
   :class:`eegdash.api.EEGDashDataset` is created to search and retrieve an
   EEG dataset. At this step, only the metadata is transferred.
2. **Data Preprocessing Using Braindecode**: The EEG data is preprocessed
   with Braindecode by reannotating events, selecting specific channels,
   resampling, filtering, and extracting 2-second epochs, ensuring a balanced
   amount of eyes-open and eyes-closed data for analysis.
3. **Creating Training and Test Sets**: The dataset is split into training
   (80%) and test (20%) sets with balanced labels, converted into PyTorch
   tensors, and wrapped in ``DataLoader`` objects for efficient mini-batch
   training.
4. **Model Definition**: The model is a shallow convolutional neural network
   (``ShallowFBCSPNet``) with 24 input channels (EEG channels) and 2 output
   classes (eyes-open and eyes-closed).
5. **Model Training and Evaluation**: The neural network is trained by
   normalizing the input data, computing the cross-entropy loss, and updating
   the model parameters; classification accuracy is evaluated over six
   epochs.
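The balanced 80/20 split mentioned in step 3 can be sketched in plain NumPy.
This is only an illustration of the idea, not the tutorial's actual code; the
helper name ``stratified_split`` and the synthetic labels are invented for
this example:

.. code-block:: Python

    import numpy as np

    def stratified_split(y, test_frac=0.2, seed=0):
        """Hypothetical helper: split indices 80/20 while keeping each
        class equally represented in the test set."""
        rng = np.random.default_rng(seed)
        train_idx, test_idx = [], []
        for label in np.unique(y):
            idx = np.flatnonzero(y == label)
            rng.shuffle(idx)
            n_test = int(round(len(idx) * test_frac))
            test_idx.extend(idx[:n_test])
            train_idx.extend(idx[n_test:])
        return np.array(train_idx), np.array(test_idx)

    # Synthetic labels: 100 eyes-open (0) and 100 eyes-closed (1) windows.
    y = np.array([0] * 100 + [1] * 100)
    train_idx, test_idx = stratified_split(y)

The resulting index arrays could then be used to slice the windowed data
before wrapping it in PyTorch ``DataLoader`` objects.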
.. GENERATED FROM PYTHON SOURCE LINES 22-30

Data Retrieval Using EEGDash
----------------------------

This section instantiates :class:`eegdash.api.EEGDashDataset` to fetch the
metadata for the experiment before requesting any recordings. First, we find
one resting-state dataset; it contains both eyes-open and eyes-closed data.

.. GENERATED FROM PYTHON SOURCE LINES 30-33

.. code-block:: Python

    from pathlib import Path

    cache_folder = Path.home() / "eegdash"

.. GENERATED FROM PYTHON SOURCE LINES 34-41

.. code-block:: Python

    from eegdash import EEGDashDataset

    ds_eoec = EEGDashDataset(
        query={
            "dataset": "ds005514",
            "task": "RestingState",
            "subject": "NDARDB033FW5",
        },
        cache_dir=cache_folder,
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [10/13/25 00:28:51] WARNING  Cache directory does not exist, creating api.py:726
                                 it: /home/runner/eegdash
    ╭────────────────────── EEG 2025 Competition Data Notice ──────────────────────╮
    │ This notice is only for users who are participating in the EEG 2025          │
    │ Competition.                                                                 │
    │                                                                              │
    │ EEG 2025 Competition Data Notice!                                            │
    │ You are loading one of the datasets that is used in competition, but via     │
    │ `EEGDashDataset`.                                                            │
    │                                                                              │
    │ IMPORTANT:                                                                   │
    │ If you download data from `EEGDashDataset`, it is NOT identical to the       │
    │ official competition data, which is accessed via `EEGChallengeDataset`.      │
    │ The competition data has been downsampled and filtered.                      │
    │                                                                              │
    │ If you are participating in the competition,                                 │
    │ you must use the `EEGChallengeDataset` object to ensure consistency.         │
    │                                                                              │
    │ If you are not participating in the competition, you can ignore this         │
    │ message.                                                                     │
    ╰─────────────────────────── Source: EEGDashDataset ───────────────────────────╯

.. GENERATED FROM PYTHON SOURCE LINES 42-72

Data Preprocessing Using Braindecode
------------------------------------

`braindecode `__ is a specialized library for preprocessing EEG and MEG data.
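Conceptually, the ``query`` argument acts as a metadata filter: only
recordings whose metadata match every key/value pair in the query are
returned. The following toy snippet illustrates that matching logic in plain
Python; it is not the eegdash implementation, and the extra records are made
up for the example:

.. code-block:: Python

    # Made-up metadata records; only the first matches the query below.
    records = [
        {"dataset": "ds005514", "task": "RestingState", "subject": "NDARDB033FW5"},
        {"dataset": "ds005514", "task": "RestingState", "subject": "NDARXX000AA0"},
        {"dataset": "ds005505", "task": "RestingState", "subject": "NDARDB033FW5"},
    ]

    query = {"dataset": "ds005514", "subject": "NDARDB033FW5"}

    # A record matches when every key/value pair in the query agrees with it.
    matches = [r for r in records if all(r.get(k) == v for k, v in query.items())]
    print(len(matches))  # 1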
In this dataset, there are two key events in the continuous data:
**instructed_toCloseEyes**, marking the start of a 40-second eyes-closed
period, and **instructed_toOpenEyes**, marking the start of a 20-second
eyes-open period.

For the eyes-closed event, we extract 14 seconds of data, from 15 to 29
seconds after the event onset. Similarly, for the eyes-open event, we extract
data from 5 to 19 seconds after the event onset. This ensures an equal amount
of data for both conditions. The event extraction is handled by the custom
function :func:`eegdash.hbn.preprocessing.hbn_ec_ec_reannotation`.

Next, we apply four preprocessing steps in Braindecode:

1. **Reannotation** of event markers using
   :func:`eegdash.hbn.preprocessing.hbn_ec_ec_reannotation`.
2. **Selection** of 24 specific EEG channels from the original 128.
3. **Resampling** the EEG data to a frequency of 128 Hz.
4. **Filtering** the EEG signals to retain frequencies between 1 Hz and 55 Hz.

When the `preprocess` function is called, the data is retrieved from the
remote repository. Finally, we use `create_windows_from_events` to extract
2-second epochs from the data. These epochs serve as the dataset samples. At
this stage, each sample is automatically labeled with the corresponding event
type (eyes-open or eyes-closed).

`windows_ds` is a PyTorch dataset; when queried, it returns labels for
eyes-open and eyes-closed (assigned as labels 0 and 1, corresponding to their
respective event markers).

.. GENERATED FROM PYTHON SOURCE LINES 74-131
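The window timing described above can be sanity-checked with a short sketch.
The helper ``window_starts`` and the event-onset times are invented for
illustration; in the tutorial itself this bookkeeping happens inside
`hbn_ec_ec_reannotation` and `create_windows_from_events`:

.. code-block:: Python

    import numpy as np

    def window_starts(event_onset_s, keep_from_s, keep_to_s, win_s=2.0):
        """Start times (in seconds) of consecutive, non-overlapping windows
        inside [onset + keep_from_s, onset + keep_to_s]."""
        t0 = event_onset_s + keep_from_s
        t1 = event_onset_s + keep_to_s
        return np.arange(t0, t1 - win_s + 1e-9, win_s)

    # Eyes-closed: keep 15-29 s after an instructed_toCloseEyes onset (here 100 s).
    ec = window_starts(100.0, 15.0, 29.0)
    # Eyes-open: keep 5-19 s after an instructed_toOpenEyes onset (here 200 s).
    eo = window_starts(200.0, 5.0, 19.0)

    # Both spans are 14 s long, so both yield the same number of 2-s windows.
    print(len(ec), len(eo))  # 7 7

Equal-length spans for the two conditions are what keep the resulting class
labels balanced.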
.. code-block:: Python

    import warnings

    import numpy as np
    from braindecode.preprocessing import (
        Preprocessor,
        create_windows_from_events,
        preprocess,
    )

    from eegdash.hbn.preprocessing import hbn_ec_ec_reannotation

    warnings.simplefilter("ignore", category=RuntimeWarning)

    # Braindecode preprocessors
    preprocessors = [
        hbn_ec_ec_reannotation(),
        Preprocessor(
            "pick_channels",
            ch_names=[
                "E22", "E9", "E33", "E24", "E11", "E124", "E122", "E29",
                "E6", "E111", "E45", "E36", "E104", "E108", "E42", "E55",
                "E93", "E58", "E52", "E62", "E92", "E96", "E70", "Cz",
            ],
        ),
        Preprocessor("resample", sfreq=128),
        Preprocessor("filter", l_freq=1, h_freq=55),
    ]
    preprocess(ds_eoec, preprocessors)

    # Extract 2-second segments (256 samples at 128 Hz)
    windows_ds = create_windows_from_events(
        ds_eoec,
        trial_start_offset_samples=0,
        trial_stop_offset_samples=256,
        preload=True,
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Downloading dataset_description.json:   0%|          | 0.00/1.00 [00:00

.. container:: sphx-glr-download sphx-glr-download-python

    :download:`Download Python source code: tutorial_eoec.py `

.. container:: sphx-glr-download sphx-glr-download-zip

    :download:`Download zipped: tutorial_eoec.zip `

.. only:: html

    .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_