NM000115: EEG dataset, 4 subjects#
Zhou2016
Access recordings and metadata through EEGDash.
Citation: Bangyan Zhou, Xiaopei Wu, Zongtan Lv, Lei Zhang, Xiaojin Guo (2016). Zhou2016. doi:10.82901/nemar.nm000115
Modality: EEG | Subjects: 4 | Recordings: 24 | License: CC-BY-4.0 | Source: NeMAR
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import NM000115
dataset = NM000115(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = NM000115(cache_dir="./data", subject="01")
Advanced query
dataset = NM000115(
cache_dir="./data",
query={"subject": {"$in": ["01", "02"]}},
)
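The query argument accepts MongoDB-style operators such as $in. As a rough illustration of the selection semantics only (plain Python with hypothetical record dicts, not EEGDash internals), a $in filter keeps records whose field value appears in the given list:

```python
# Hypothetical metadata records; the field name mirrors the query example above.
records = [
    {"subject": "01", "task": "motorimagery"},
    {"subject": "02", "task": "motorimagery"},
    {"subject": "03", "task": "motorimagery"},
]

def matches_in(record, field, allowed):
    """Emulate a MongoDB-style {field: {"$in": allowed}} filter."""
    return record.get(field) in allowed

selected = [r for r in records if matches_in(r, "subject", ["01", "02"])]
print([r["subject"] for r in selected])  # → ['01', '02']
```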
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{nm000115,
title = {Zhou2016},
author = {Bangyan Zhou and Xiaopei Wu and Zongtan Lv and Lei Zhang and Xiaojin Guo},
doi = {10.82901/nemar.nm000115},
url = {https://doi.org/10.82901/nemar.nm000115},
}
About This Dataset#
Introduction
This dataset contains EEG recordings from four subjects performing motor imagery tasks (left hand, right hand, and feet), originally published by Zhou et al. (2016). The data was reformatted into BIDS from its Zenodo version (https://zenodo.org/records/16534752), which was itself generated by MOABB (Mother of All BCI Benchmarks, https://github.com/NeuroTechX/moabb). The original study investigated a fully automated trial selection method for optimization of motor imagery based brain-computer interfaces.
Overview of the experiment
Four participants each completed three recording sessions separated by days to months. Each session contained two consecutive runs with inter-run breaks. Each run comprised 75 trials (25 per class: left hand, right hand, and feet imagery), for a total of 450 trials per subject across all sessions. Trials began with an auditory cue, followed by a 5-second visual arrow stimulus indicating the motor imagery task to perform, then a 4-second rest period. EEG was recorded from 14 channels placed according to the extended 10/20 system (Fp1, Fp2, FC3, FCz, FC4, C3, Cz, C4, CP3, CPz, CP4, O1, Oz, O2) at a sampling frequency of 250 Hz with a 50 Hz power line frequency.
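The trial counts above are internally consistent, as a quick arithmetic check shows (values taken directly from the description above):

```python
# Per-run structure described in the experiment overview.
trials_per_class = 25
classes = ["left_hand", "right_hand", "feet"]
runs_per_session = 2
sessions = 3

trials_per_run = trials_per_class * len(classes)
trials_per_subject = trials_per_run * runs_per_session * sessions
print(trials_per_run, trials_per_subject)  # → 75 450
```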
Dataset structure
4 subjects (sub-1 through sub-4)
3 sessions per subject (ses-0, ses-1, ses-2)
2 runs per session (run-0, run-1)
24 EEG recordings total in EDF format
14 EEG channels, 250 Hz sampling rate
3 event types: left_hand (value=2), right_hand (value=3), feet (value=1)
Electrode positions in CapTrak coordinate system
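The event codes listed above map numeric values to motor imagery classes. A minimal sketch of decoding an event stream with that mapping (the raw_codes list is hypothetical; only the code-to-label mapping comes from the dataset description):

```python
# Event codes as documented above (dataset-defined values).
event_id = {"feet": 1, "left_hand": 2, "right_hand": 3}
code_to_label = {v: k for k, v in event_id.items()}

raw_codes = [2, 3, 1, 2]  # hypothetical sequence of event values
labels = [code_to_label[c] for c in raw_codes]
print(labels)  # → ['left_hand', 'right_hand', 'feet', 'left_hand']
```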
Preprocessing
The data distributed here has undergone minimal preprocessing by MOABB prior to BIDS conversion:
- Extraction of the 14 EEG channels from the original recordings
- Annotation of motor imagery events (left_hand, right_hand, feet) with 5-second durations
- Resampling to 250 Hz
- Export to EDF format
Original and related datasets
This dataset was reformatted into BIDS from the Zenodo archive at https://zenodo.org/records/16534752. That archive was generated by MOABB v1.2.0 from the original data accompanying the publication. The original study and data are described in: Zhou B, Wu X, Lv Z, Zhang L, Guo X (2016). A Fully Automated Trial Selection Method for Optimization of Motor Imagery Based Brain-Computer Interface. PLoS ONE 11(9): e0162657. https://doi.org/10.1371/journal.pone.0162657
References
Zhou B, Wu X, Lv Z, Zhang L, Guo X (2016). A Fully Automated Trial Selection Method for Optimization of Motor Imagery Based Brain-Computer Interface. PLoS ONE 11(9): e0162657. https://doi.org/10.1371/journal.pone.0162657
Appelhoff S, Sanderson M, Brooks T, et al. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software 4: 1896. https://doi.org/10.21105/joss.01896
Pernet CR, Appelhoff S, Gorgolewski KJ, et al. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8
Data curator for NEMAR version: Arnaud Delorme (UCSD, La Jolla, CA, USA)
Dataset Information#
Dataset ID: NM000115
Title: Zhou2016
Author (year): Zhou2016
Canonical: —
Importable as: NM000115, Zhou2016
Year: 2016
Authors: Bangyan Zhou, Xiaopei Wu, Zongtan Lv, Lei Zhang, Xiaojin Guo
License: CC-BY-4.0
Citation / DOI: 10.82901/nemar.nm000115
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 4
Recordings: 24
Tasks: 1
Channels: 14
Sampling rate (Hz): 250
Duration (hours): 6.27
Pathology: Not specified
Modality: EEG
Type: —
Size on disk: 152.1 MB
File count: 24
Format: BIDS
License: CC-BY-4.0
DOI: 10.82901/nemar.nm000115
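As a quick sanity check on the figures above, the total duration and file count imply an average recording length of roughly 15–16 minutes:

```python
# Values taken from the technical details above.
total_hours = 6.268284444444444
n_recordings = 24

avg_seconds = total_hours * 3600 / n_recordings
print(round(avg_seconds))  # → 940 (about 15.7 minutes per recording)
```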
API Reference#
Use the NM000115 class to access this dataset programmatically.
class eegdash.dataset.NM000115(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset

Zhou2016

Study: nm000115 (NeMAR)
Author (year): Zhou2016
Canonical: —
Also importable as: NM000115, Zhou2016
Modality: eeg. Subjects: 4; recordings: 24; tasks: 1.

Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

data_dir#
Local dataset cache directory (cache_dir / dataset_id).
Type: Path

query#
Merged query with the dataset filter applied.
Type: dict

records#
Metadata records used to build the dataset, if pre-fetched.
Type: list[dict] | None

Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
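The query-merging behavior described in the notes can be sketched in plain Python. This is a hypothetical merge_queries helper illustrating the AND semantics and the reserved dataset key, not the library's actual implementation:

```python
def merge_queries(dataset_filter, user_query):
    """AND a user query with the fixed dataset filter (illustrative sketch only).

    Rejects the reserved 'dataset' key, mirroring the documented
    restriction on the query parameter.
    """
    if "dataset" in user_query:
        raise ValueError("query must not contain the key 'dataset'")
    return {"$and": [dataset_filter, user_query]}

merged = merge_queries({"dataset": "nm000115"}, {"subject": {"$in": ["01", "02"]}})
print(merged)
```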
OpenNeuro dataset: https://openneuro.org/datasets/nm000115
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000115
DOI: https://doi.org/10.82901/nemar.nm000115
Examples
>>> from eegdash.dataset import NM000115
>>> dataset = NM000115(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset