NM000339: EEG dataset, 62 subjects#
Stieger2021
Access recordings and metadata through EEGDash.
Citation: James R. Stieger, Stephen A. Engel, & Bin He (2021). Stieger2021. DOI: 10.1038/s41597-021-00883-1
Modality: EEG | Subjects: 62 | Recordings: 598 | License: CC-BY-NC-4.0 | Source: NeMAR
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import NM000339
dataset = NM000339(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = NM000339(cache_dir="./data", subject="01")
Advanced query
dataset = NM000339(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{nm000339,
  title  = {Stieger2021},
  author = {James R. Stieger and Stephen A. Engel and Bin He},
  doi    = {10.1038/s41597-021-00883-1},
  url    = {https://doi.org/10.1038/s41597-021-00883-1},
}
About This Dataset#
Stieger2021
Motor Imagery dataset from Stieger et al. 2021 [1].
Dataset Overview
Code: Stieger2021
Paradigm: imagery
DOI: 10.1038/s41597-021-00883-1
Subjects: 62
Sessions per subject: 11
Events: right_hand=1, left_hand=2, both_hand=3, rest=4
Trial interval: [0, 3] s
File format: MAT
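The event coding above can be captured as a plain mapping, e.g. in the `event_id` dict form that MNE-style epoching expects. A minimal sketch (the helper function is illustrative, not part of eegdash):

```python
# Event codes from the dataset overview, as an event_id-style mapping.
EVENT_ID = {"right_hand": 1, "left_hand": 2, "both_hand": 3, "rest": 4}

def label_for(code):
    """Look up the class label for a numeric event code."""
    inverse = {v: k for k, v in EVENT_ID.items()}
    return inverse[code]
```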
Acquisition
Sampling rate: 1000.0 Hz
Number of channels: 62
Channel types: eeg=62
Channel names: AF3, AF4, C1, C2, C3, C4, C5, C6, CP1, CP2, CP3, CP4, CP5, CP6, CPz, Cz, F1, F2, F3, F4, F5, F6, F7, F8, FC1, FC2, FC3, FC4, FC5, FC6, FCz, FT7, FT8, Fp1, Fp2, Fpz, Fz, O1, O2, Oz, P1, P2, P3, P4, P5, P6, P7, P8, PO3, PO4, PO5, PO6, PO7, PO8, POz, Pz, T7, T8, TP7, TP8
Montage: 10-10
Hardware: Neuroscan SynAmps RT amplifiers
Software: Neuroscan
Sensor type: EEG
Line frequency: 60.0 Hz
Online filters: 0.1 to 200 Hz with 60 Hz notch filter
Impedance threshold: 5.0 kOhm
Cap manufacturer: Neuroscan
Cap model: Quik-Cap
Participants
Number of subjects: 62
Health status: healthy
Age: min=18, max=63
Gender distribution: male=13, female=49
Handedness: mostly right-handed
Species: human
Experimental Protocol
Paradigm: imagery
Number of classes: 4
Class labels: right_hand, left_hand, both_hand, rest
Tasks: LR, UD, 2D
Study design: longitudinal training study with intervention
Feedback type: visual
Stimulus type: target_bar
Stimulus modalities: visual
Primary modality: visual
Mode: online
Instructions: Imagine your left (right) hand opening and closing to move the cursor left (right). Imagine both hands opening and closing to move the cursor up. Finally, to move the cursor down, voluntarily rest; in other words, clear your mind.
HED Event Annotations
Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser
right_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Move
└─ Right, Hand
left_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Move
└─ Left, Hand
both_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine, Move, Hand
rest
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Rest
Paradigm-Specific Parameters
Detected paradigm: motor_imagery
Imagery tasks: left_hand, right_hand, both_hands, rest
Cue duration: 2.0 s
Imagery duration: 6.0 s
Data Structure
Trials: 450
Blocks per session: 18
Trials context: per_session
Preprocessing
Data state: raw
Preprocessing applied: False
Signal Processing
Feature extraction: ERD, ERS, autoregressive model, power spectrum
Frequency bands: alpha=[10.5, 13.5] Hz; mu=[8, 14] Hz
Spatial filters: Laplacian (C3/C4 with 4 surrounding electrodes)
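A small-Laplacian filter of the kind described above subtracts the mean of the four neighbouring electrodes from the centre channel. A minimal NumPy sketch (the neighbour sets are a plausible choice from the 10-10 montage listed above, not taken from the original study code):

```python
import numpy as np

# Assumed neighbour sets for the small Laplacian around C3/C4 (10-10 montage).
NEIGHBOURS = {
    "C3": ["FC3", "C5", "C1", "CP3"],
    "C4": ["FC4", "C2", "C6", "CP4"],
}

def laplacian(data, ch_names, centre):
    """Small-Laplacian filter: centre channel minus the mean of its neighbours.

    data: (n_channels, n_samples) array; ch_names: channel labels in row order.
    """
    idx = {name: i for i, name in enumerate(ch_names)}
    centre_sig = data[idx[centre]]
    neigh = np.mean([data[idx[n]] for n in NEIGHBOURS[centre]], axis=0)
    return centre_sig - neigh
```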
Cross-Validation
Evaluation type: cross_session
Performance (Original Study)
Accuracy: 70.0%
PVC 1D threshold: 70.0%
PVC 2D threshold: 40.0%
BCI Application
Applications: cursor_control
Environment: laboratory
Online feedback: True
Tags
Pathology: Healthy
Modality: Motor
Type: Active
Documentation
Description: Continuous sensorimotor rhythm based brain computer interface learning in a large population
DOI: 10.1038/s41597-021-00883-1
License: CC-BY-NC-4.0
Investigators: James R. Stieger, Stephen A. Engel, Bin He
Senior author: Bin He
Contact: bhe1@andrew.cmu.edu
Institution: Carnegie Mellon University, University of Minnesota
Department: Carnegie Mellon University, Pittsburgh, PA, USA; University of Minnesota, Minneapolis, MN, USA
Address: Pittsburgh, PA, USA; Minneapolis, MN, USA
Country: US
Repository: GitHub
Data URL: https://doi.org/10.6084/m9.figshare.13123148.v1
Publication year: 2021
Funding: NIH AT009263; NIH EB021027; NIH NS096761; NIH MH114233; NIH EB029354
Ethics approval: University of Minnesota IRB; Carnegie Mellon University IRB
Keywords: BCI, sensorimotor rhythm, motor imagery, EEG, longitudinal, learning
Abstract
Brain computer interfaces (BCIs) are valuable tools that expand the nature of communication through bypassing traditional neuromuscular pathways. The non-invasive, intuitive, and continuous nature of sensorimotor rhythm (SMR) based BCIs enables individuals to control computers, robotic arms, wheelchairs, and even drones by decoding motor imagination from electroencephalography (EEG). Large and uniform datasets are needed to design, evaluate, and improve the BCI algorithms. In this work, we release a large and longitudinal dataset collected during a study that examined how individuals learn to control SMR-BCIs. The dataset contains over 600 hours of EEG recordings collected during online and continuous BCI control from 62 healthy, (mostly) right-hand-dominant adults, across up to 11 training sessions per participant. The data record consists of 598 recording sessions and over 250,000 trials of 4 different motor-imagery-based BCI tasks.
Methodology
Participants completed 7-11 online BCI training sessions. Each session consisted of 450 trials across 3 tasks (LR, UD, 2D) with 6 runs total. Each trial: 2s inter-trial interval, 2s target presentation, up to 6s feedback control. Online control used spatial filtering (Laplacian around C3/C4), autoregressive model (order 16) for spectrum estimation, alpha power (12 Hz ± 1.5 Hz) for control signal. Horizontal motion controlled by lateralized alpha power (C4-C3), vertical motion by total alpha power (C4+C3). Control signals normalized to zero mean and unit variance. Cursor position updated every 40 ms.
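The cursor control signals described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: horizontal control is the lateralized alpha power (C4 − C3), vertical control is the total alpha power (C4 + C3), and each signal is normalized to zero mean and unit variance.

```python
import numpy as np

def control_signals(alpha_c3, alpha_c4):
    """Cursor control signals from per-update alpha-power estimates at C3/C4.

    Horizontal = C4 - C3 (lateralized power); vertical = C4 + C3 (total power);
    both z-scored to zero mean and unit variance, as in the methodology above.
    """
    c3 = np.asarray(alpha_c3, dtype=float)
    c4 = np.asarray(alpha_c4, dtype=float)

    def zscore(x):
        return (x - x.mean()) / x.std()

    return zscore(c4 - c3), zscore(c4 + c3)
```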
References
[1] Stieger, J. R., Engel, S. A., & He, B. (2021). Continuous sensorimotor rhythm based brain computer interface learning in a large population. Scientific Data, 8(1), 98. https://doi.org/10.1038/s41597-021-00883-1
Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., & Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., & Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8
Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb
Notes
.. versionadded:: 1.1.0
Dataset Information#
Dataset ID | NM000339
Title | Stieger2021
Author (year) | Stieger2021
Canonical | —
Importable as | NM000339, Stieger2021
Year | 2021
Authors | James R. Stieger, Stephen A. Engel, Bin He
License | CC-BY-NC-4.0
Citation / DOI | 10.1038/s41597-021-00883-1
Source links | OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 62
Recordings: 598
Tasks: 1
Channels: 60
Sampling rate (Hz): 1000.0
Duration (hours): 615.35
Pathology: Healthy
Modality: Visual
Type: Learning
Size on disk: 371.5 GB
File count: 598
Format: BIDS
License: CC-BY-NC-4.0
DOI: 10.1038/s41597-021-00883-1
API Reference#
Use the NM000339 class to access this dataset programmatically.
- class eegdash.dataset.NM000339(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
Stieger2021
- Study: nm000339 (NeMAR)
- Author (year): Stieger2021
- Canonical: —
- Also importable as: NM000339, Stieger2021
- Modality: eeg; Experiment type: Learning; Subject type: Healthy. Subjects: 62; recordings: 598; tasks: 1.
- Parameters:
  cache_dir (str | Path) – Directory where data are cached locally.
  query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
  s3_bucket (str | None) – Base S3 bucket used to locate the data.
  **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
  Local dataset cache directory (cache_dir / dataset_id).
  - Type: Path
- query#
  Merged query with the dataset filter applied.
  - Type: dict
- records#
  Metadata records used to build the dataset, if pre-fetched.
  - Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/nm000339
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000339
DOI: https://doi.org/10.1038/s41597-021-00883-1
Examples
>>> from eegdash.dataset import NM000339
>>> dataset = NM000339(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
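Building on the doctest above, here is a hedged sketch of composing a MongoDB-style subject filter for the query parameter. The helper function is hypothetical (not part of the eegdash API); valid field names are those in ALLOWED_QUERY_FIELDS.

```python
# Hypothetical helper: build an {"$in": ...} filter over the "subject"
# field, in the shape NM000339's `query` parameter accepts.
def make_subject_query(subjects):
    """Return a MongoDB-style filter selecting the given subject labels."""
    return {"subject": {"$in": sorted(subjects)}}
```

For example, `make_subject_query({"02", "01"})` produces the same filter used in the Quickstart's advanced-query example.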
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset