NM000329: EEG dataset, 16 subjects#
Brandl2020
Access recordings and metadata through EEGDash.
Citation: Stephanie Brandl, Benjamin Blankertz, Tobias Dahne (2020). Brandl2020. 10.3389/fnins.2020.566147
Modality: eeg Subjects: 16 Recordings: 112 License: CC-BY-NC-ND-4.0 Source: nemar
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import NM000329
dataset = NM000329(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = NM000329(cache_dir="./data", subject="01")
Advanced query
dataset = NM000329(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{nm000329,
  title  = {Brandl2020},
  author = {Stephanie Brandl and Benjamin Blankertz and Tobias Dahne},
  year   = {2020},
  doi    = {10.3389/fnins.2020.566147},
  url    = {https://doi.org/10.3389/fnins.2020.566147},
}
About This Dataset#
Brandl2020
Motor Imagery under distraction dataset from Brandl and Blankertz (2020).
Dataset Overview
- Code: Brandl2020
- Paradigm: imagery
- DOI: 10.3389/fnins.2020.566147
- Subjects: 16
- Sessions per subject: 1
- Events: left_hand=1, right_hand=2
- Trial interval: [0, 4.5] s
- Runs per session: 7
- File format: MAT (HDF5 v7.3)
Acquisition
- Sampling rate: 1000.0 Hz
- Number of channels: 63
- Channel types: eeg=63
- Channel names: AF3, AF4, AF7, AF8, AFz, C1, C2, C3, C4, C5, C6, CP1, CP2, CP3, CP4, CP5, CP6, CPz, Cz, F1, F2, F3, F4, F5, F6, F7, F8, FC1, FC2, FC3, FC4, FC5, FC6, FCz, FT7, FT8, Fp1, Fp2, Fpz, Fz, O1, O2, Oz, P1, P2, P3, P4, P5, P6, P7, P8, PO3, PO4, PO7, PO8, POz, Pz, T7, T8, TP10, TP7, TP8, TP9
- Montage: standard_1005
- Hardware: 2x BrainAmp (Brain Products)
- Software: BBCI Toolbox (MATLAB)
- Reference: nose
- Sensor type: Ag/AgCl wet
- Line frequency: 50.0 Hz
- Cap manufacturer: EasyCap
- Cap model: Fast’n Easy Cap
Participants
- Number of subjects: 16
- Health status: healthy
- Age: mean=26.3
- Gender distribution: female=6, male=10
- BCI experience: mostly naive (3/16 had prior BCI experience)
Experimental Protocol
- Paradigm: imagery
- Number of classes: 2
- Class labels: left_hand, right_hand
- Trial duration: 4.5 s
- Tasks: calibration, clean, eyesclosed, news, numbers, flicker, stimulation
- Study design: Motor imagery under distraction: 1 calibration run (no feedback, no distraction) + 6 feedback runs with different distraction conditions (clean, eyes closed, news, number search, flicker, vibro-tactile stimulation)
- Feedback type: auditory
- Stimulus type: auditory
- Stimulus modalities: auditory
- Primary modality: auditory
- Synchronicity: cue-based
- Mode: online
- Training/test split: False
- Instructions: Subjects received auditory cues (‘links’ for left, ‘rechts’ for right) and performed motor imagery of left or right hand movement
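The cue-based trial structure described above (fixed 4.5 s trials starting at each cue onset, 1000 Hz sampling) can be sketched as a simple epoching step. The function below is an illustrative NumPy sketch, not part of the EEGDash API; `epoch_trials` and its arguments are hypothetical names.

```python
import numpy as np

def epoch_trials(data, event_samples, sfreq=1000.0, tmin=0.0, tmax=4.5):
    """Slice fixed-length trials covering [tmin, tmax] s after each cue onset.

    data: (n_channels, n_samples) continuous EEG
    event_samples: onset sample index of each cue
    returns: (n_trials, n_channels, n_trial_samples)
    """
    start = int(round(tmin * sfreq))
    length = int(round((tmax - tmin) * sfreq))
    return np.stack(
        [data[:, s + start : s + start + length] for s in event_samples]
    )

# Toy example: 63 channels, 30 s of random data, 3 cue onsets
rng = np.random.default_rng(0)
data = rng.standard_normal((63, 30_000))
epochs = epoch_trials(data, event_samples=[1000, 8000, 15_000])
print(epochs.shape)  # (3, 63, 4500): 4.5 s at 1000 Hz per trial
```

In practice you would take the continuous signal and event onsets from the loaded recording rather than generating them.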
HED Event Annotations
Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser
left_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Move
└─ Left, Hand
right_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Move
└─ Right, Hand
Paradigm-Specific Parameters
- Detected paradigm: motor_imagery
- Imagery tasks: left_hand, right_hand
- Imagery duration: 4.5 s
Data Structure
- Trials: 504
- Trials per class: left_hand=252, right_hand=252
- Blocks per session: 7
- Trials context: 7 runs per subject: 1 calibration (72 trials) + 6 feedback runs (72 trials each, 6 distraction conditions)
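As a quick sanity check, the counts above are mutually consistent with the subject and recording totals stated elsewhere on this page:

```python
# Bookkeeping from the dataset metadata above
subjects = 16
runs_per_subject = 7      # 1 calibration + 6 distraction/feedback runs
trials_per_run = 72

total_trials = runs_per_subject * trials_per_run
trials_per_class = total_trials // 2   # balanced left_hand / right_hand

print(total_trials)                  # 504, matching "Trials: 504"
print(trials_per_class)              # 252, matching "left_hand=252, right_hand=252"
print(subjects * runs_per_subject)   # 112, matching "Recordings: 112"
```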
Preprocessing
- Data state: raw
- Preprocessing applied: False
Signal Processing
- Classifiers: CSP+LDA
- Feature extraction: CSP, bandpower
- Frequency bands: mu=[8.0, 13.0] Hz; beta=[13.0, 30.0] Hz
- Spatial filters: CSP
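The CSP+LDA pipeline named above can be sketched end to end. This is a minimal, self-contained illustration on synthetic two-class data, not the original study's code: CSP filters come from the generalized eigendecomposition of the class covariance matrices, trials are reduced to log-variance features, and an LDA classifier separates the classes. All function names here are illustrative.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_fit(X1, X2, n_components=4):
    """Learn CSP spatial filters from two classes of trials (trials, channels, samples)."""
    def mean_cov(X):
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)
    C1, C2 = mean_cov(X1), mean_cov(X2)
    # Generalized eigenproblem C1 w = lambda (C1 + C2) w; extreme eigenvalues
    # give filters maximizing variance for one class relative to the other.
    vals, vecs = eigh(C1, C1 + C2)
    order = np.argsort(vals)
    pick = np.r_[order[: n_components // 2], order[-(n_components // 2):]]
    return vecs[:, pick].T  # (n_components, channels)

def csp_features(W, X):
    """Log of normalized variance of each CSP component per trial."""
    Z = np.einsum('fc,tcs->tfs', W, X)
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Synthetic data: each class has elevated variance on a different channel
rng = np.random.default_rng(42)
def make_class(boost_ch, n=60, ch=8, samp=500):
    X = rng.standard_normal((n, ch, samp))
    X[:, boost_ch, :] *= 3.0
    return X

X1, X2 = make_class(0), make_class(1)
W = csp_fit(X1[:40], X2[:40])

clf = LinearDiscriminantAnalysis()
Xtr = np.vstack([csp_features(W, X1[:40]), csp_features(W, X2[:40])])
clf.fit(Xtr, np.r_[np.zeros(40), np.ones(40)])

Xte = np.vstack([csp_features(W, X1[40:]), csp_features(W, X2[40:])])
acc = clf.score(Xte, np.r_[np.zeros(20), np.ones(20)])
print(f"held-out accuracy: {acc:.2f}")  # near 1.0 on this easy toy problem
```

For real recordings from this dataset you would first band-pass filter to the mu/beta bands listed above before fitting CSP.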
Cross-Validation
- Method: holdout
- Evaluation type: within_subject
BCI Application
- Applications: motor_control
- Environment: laboratory
- Online feedback: True
Tags
- Pathology: Healthy
- Modality: Motor
- Type: Motor Imagery
Documentation
- DOI: 10.3389/fnins.2020.566147
- License: CC-BY-NC-ND-4.0
- Investigators: Stephanie Brandl, Benjamin Blankertz, Tobias Dahne
- Senior author: Benjamin Blankertz
- Institution: Technische Universitaet Berlin
- Department: Department of Neurotechnology
- Country: DE
- Repository: DepositOnce TU Berlin
- Data URL: https://depositonce.tu-berlin.de/handle/11303/10934.2
- Publication year: 2020
- Funding: BMBF/BIFOLD (01IS18025A, 01IS18037A)
- Ethics approval: Approved by the ethics committee of the Charite University Medicine Berlin
- How to acknowledge: Please cite: Brandl, S. and Blankertz, B. (2020). Motor Imagery Under Distraction – An Open Access BCI Dataset. Frontiers in Neuroscience, 14, 566147. https://doi.org/10.3389/fnins.2020.566147
- Keywords: brain-computer interface, motor imagery, EEG, distraction, open access, BCI
Abstract
We present an open-access dataset of a motor imagery brain-computer interface (BCI) experiment conducted under six different distraction conditions. Sixteen healthy participants performed left vs. right hand motor imagery while being distracted by flickering video, number search tasks, news listening, eyes closed, vibro-tactile stimulation, or no distraction. Each participant completed one calibration run without feedback and six feedback runs under the different distraction conditions, resulting in 504 trials per subject.
Methodology
Participants completed one session with 7 runs of 72 trials each. Run 1 was calibration (no feedback, no distraction). Runs 2-7 included auditory feedback and one of six distraction conditions. Auditory cues indicated left or right hand imagery. Trial duration was 4.5 s with 2.5 s ITI. Online classification used CSP with LDA. EEG recorded at 1000 Hz with 63 channels, nose reference, using two BrainAmp amplifiers.
References
Brandl, S. and Blankertz, B. (2020). Motor Imagery Under Distraction – An Open Access BCI Dataset. Frontiers in Neuroscience, 14, 566147. https://doi.org/10.3389/fnins.2020.566147
Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(44), 1896. https://doi.org/10.21105/joss.01896
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8
Notes
.. versionadded:: 1.2.0
Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb
Dataset Information#
Dataset ID | NM000329
Title | Brandl2020
Author (year) | Brandl2020
Canonical | —
Importable as | NM000329, Brandl2020
Year | 2020
Authors | Stephanie Brandl, Benjamin Blankertz, Tobias Dahne
License | CC-BY-NC-ND-4.0
Citation / DOI | 10.3389/fnins.2020.566147
Source links | OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 16
Recordings: 112
Tasks: 1
Channels: 63
Sampling rate (Hz): 1000.0
Duration (hours): 97.11
Pathology: Healthy
Modality: Auditory
Type: Motor
Size on disk: 61.6 GB
File count: 112
Format: BIDS
License: CC-BY-NC-ND-4.0
DOI: 10.3389/fnins.2020.566147
API Reference#
Use the NM000329 class to access this dataset programmatically.
- class eegdash.dataset.NM000329(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
Brandl2020
- Study: nm000329 (NeMAR)
- Author (year): Brandl2020
- Canonical: —
- Also importable as: NM000329, Brandl2020
- Modality: eeg; Experiment type: Motor; Subject type: Healthy
- Subjects: 16; recordings: 112; tasks: 1
Parameters:
- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
Attributes:
- data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
- query (dict) – Merged query with the dataset filter applied.
- records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
- OpenNeuro dataset: https://openneuro.org/datasets/nm000329
- NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000329
- DOI: https://doi.org/10.3389/fnins.2020.566147
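The query-merging behavior described in the notes (user filters AND-ed with the dataset filter, with `dataset` as a reserved key) can be illustrated with a small sketch. `merge_query` is a hypothetical helper for illustration only, not part of the EEGDash API:

```python
def merge_query(dataset_id, user_query):
    """Illustrative AND-merge of a user query with the dataset filter.

    Mirrors the documented constraint that the user query must not
    contain the reserved key "dataset".
    """
    if user_query and "dataset" in user_query:
        raise ValueError("query must not contain the key 'dataset'")
    merged = {"dataset": dataset_id}
    merged.update(user_query or {})
    return merged

print(merge_query("nm000329", {"subject": {"$in": ["01", "02"]}}))
# {'dataset': 'nm000329', 'subject': {'$in': ['01', '02']}}
```

Passing such a merged filter is conceptually what happens when you supply `query=` to the `NM000329` constructor shown in the Quickstart.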
Examples
>>> from eegdash.dataset import NM000329
>>> dataset = NM000329(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset