NM000201: eeg dataset, 24 subjects#
ERP paradigm of the Mobile BCI dataset
Access recordings and metadata through EEGDash.
Citation: Young-Eun Lee, Gi-Hwan Shin, Minji Lee, Seong-Whan Lee (2019). ERP paradigm of the Mobile BCI dataset.
Modality: eeg Subjects: 24 Recordings: 113 License: CC BY 4.0 Source: nemar
Metadata: Complete (90%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import NM000201
dataset = NM000201(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = NM000201(cache_dir="./data", subject="01")
Advanced query
dataset = NM000201(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
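The dataset's event coding (Target=2, NonTarget=1) and its fixed 1-second trial interval make epoch boundaries easy to compute by hand. Below is a minimal sketch, independent of EEGDash; `epoch_window` is a hypothetical helper for illustration, not part of the eegdash API:

```python
# Event codes and trial interval as documented for this dataset
# (Target=2, NonTarget=1; trial interval [0, 1.0] s).
EVENT_ID = {"Target": 2, "NonTarget": 1}
TRIAL_INTERVAL = (0.0, 1.0)  # seconds relative to stimulus onset

def epoch_window(onset_sample, sfreq, interval=TRIAL_INTERVAL):
    """Return (start, stop) sample indices for one trial."""
    tmin, tmax = interval
    start = onset_sample + int(round(tmin * sfreq))
    stop = onset_sample + int(round(tmax * sfreq))
    return start, stop

# At 100 Hz, a 1-second trial spans 100 samples.
print(epoch_window(500, 100.0))  # -> (500, 600)
```

In practice you would epoch with your EEG toolbox of choice; the helper only makes the sample arithmetic explicit.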
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{nm000201,
  title  = {ERP paradigm of the Mobile BCI dataset},
  author = {Young-Eun Lee and Gi-Hwan Shin and Minji Lee and Seong-Whan Lee},
  year   = {2019},
}
About This Dataset#
ERP paradigm of the Mobile BCI dataset
Dataset Overview
Code: Lee2021Mobile-ERP
Paradigm: p300
DOI: 10.1038/s41597-021-01094-4
Subjects: 24
Sessions per subject: 5
Events: Target=2, NonTarget=1
Trial interval: [0, 1.0] s
File format: BrainVision
Acquisition
Sampling rate: 100.0 Hz
Number of channels: 73
Channel types: eeg=73
Montage: standard_1005
Hardware: BrainAmp (Brain Products GmbH)
Reference: FCz
Ground: Fpz
Sensor type: Ag/AgCl
Line frequency: 60.0 Hz
Impedance threshold: 50 kOhm
Electrode material: Ag/AgCl
Auxiliary channels: EOG (4 ch, vertical, horizontal)
Participants
Number of subjects: 24
Health status: healthy
Age: mean=24.5, std=2.9, min=19, max=32
Gender distribution: male=14, female=10
Experimental Protocol
Paradigm: p300
Number of classes: 2
Class labels: Target, NonTarget
Trial duration: 1.0 s
Study design: BCI during motion (standing/walking/running)
Stimulus type: visual oddball
Stimulus modalities: visual
Primary modality: visual
Synchronicity: synchronous
Mode: offline
HED Event Annotations
Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser
Target
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Target
NonTarget
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Non-target
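For programmatic use, the two annotation trees above collapse to one HED string per event label. A small transcription, assembled by hand from the trees; verify against the HED schema browser before relying on it:

```python
# HED 8.4.0 tag strings for this dataset's two event classes,
# transcribed from the annotation trees above.
HED_TAGS = {
    "Target": "Sensory-event, Experimental-stimulus, Visual-presentation, Target",
    "NonTarget": "Sensory-event, Experimental-stimulus, Visual-presentation, Non-target",
}

for label, tags in HED_TAGS.items():
    print(f"{label}: {tags}")
```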
BCI Application
Environment: mobile
Online feedback: False
Tags
Pathology: healthy
Modality: visual
Type: perception
Documentation
DOI: 10.1038/s41597-021-01094-4
License: CC BY 4.0
Investigators: Young-Eun Lee, Gi-Hwan Shin, Minji Lee, Seong-Whan Lee
Senior author: Seong-Whan Lee
Institution: Korea University
Country: KR
Repository: OSF
Data URL: https://osf.io/r7s9b/
Publication year: 2021
Funding: IITP No. 2017-0-00451; IITP No. 2015-0-00185; IITP No. 2019-0-00079
Ethics approval: Institutional Review Board of Korea University, KUIRB-2019-0194-01
Keywords: SSVEP, ERP, mobile BCI, ear-EEG, locomotion
References
Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896

Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8

Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb
Dataset Information#
| Field | Value |
| --- | --- |
| Dataset ID | NM000201 |
| Title | ERP paradigm of the Mobile BCI dataset |
| Author (year) | Lee2021_ERP |
| Canonical | — |
| Importable as | NM000201, Lee2021_ERP |
| Year | 2019 |
| Authors | Young-Eun Lee, Gi-Hwan Shin, Minji Lee, Seong-Whan Lee |
| License | CC BY 4.0 |
| Citation / DOI | 10.1038/s41597-021-01094-4 |
| Source links | OpenNeuro, NeMAR, Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 24
Recordings: 113
Tasks: 1
Channels: 48 (in 108 recordings), 73 (in 5 recordings)
Sampling rate (Hz): 500.0 (in 108 recordings), 100.0 (in 5 recordings)
Duration (hours): 22.13
Pathology: Healthy
Modality: Visual
Type: Attention
Size on disk: 5.2 GB
File count: 113
Format: BIDS
License: CC BY 4.0
DOI: —
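Note the mixed sampling rates above: 108 recordings at 500 Hz and 5 at 100 Hz, so any pooled analysis needs a common rate. The sketch below illustrates only the arithmetic, using naive integer decimation on a synthetic array; a real pipeline should use an anti-aliased resampler (e.g. MNE's `Raw.resample`), so treat this as an illustration of the rate mismatch, not a recipe:

```python
import numpy as np

def decimate(signal, src_hz, dst_hz):
    """Keep every (src_hz/dst_hz)-th sample. No anti-alias filter:
    illustration only, not a substitute for a proper resampler."""
    factor = src_hz / dst_hz
    if factor != int(factor):
        raise ValueError("source rate must be an integer multiple of target rate")
    return signal[:: int(factor)]

x = np.arange(500, dtype=float)   # one second of samples at 500 Hz
y = decimate(x, 500.0, 100.0)     # the same second at 100 Hz
print(y.shape)  # -> (100,)
```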
API Reference#
Use the NM000201 class to access this dataset programmatically.
class eegdash.dataset.NM000201(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)

Bases: EEGDashDataset

ERP paradigm of the Mobile BCI dataset.

- Study: nm000201 (NeMAR)
- Author (year): Lee2021_ERP
- Canonical: —
- Also importable as: NM000201, Lee2021_ERP
- Modality: eeg; Experiment type: Attention; Subject type: Healthy
- Subjects: 24; recordings: 113; tasks: 1

Parameters:

- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
Attributes:

- data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
- query (dict) – Merged query with the dataset filter applied.
- records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.
Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References

OpenNeuro dataset: https://openneuro.org/datasets/nm000201
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000201
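The constraint that `query` must not contain the key `dataset` follows from how the user query is AND-ed with the fixed dataset filter. A rough sketch of that merging, assuming simple dict-update semantics; the actual logic lives inside `EEGDashDataset` and may differ:

```python
def merge_query(user_query, dataset_id):
    """Hypothetical illustration of AND-ing a user query with the
    dataset filter; not the actual EEGDashDataset implementation."""
    merged = {"dataset": dataset_id}
    if user_query:
        if "dataset" in user_query:
            raise ValueError("query must not contain the key 'dataset'")
        merged.update(user_query)
    return merged

print(merge_query({"subject": {"$in": ["01", "02"]}}, "nm000201"))
# -> {'dataset': 'nm000201', 'subject': {'$in': ['01', '02']}}
```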
Examples
>>> from eegdash.dataset import NM000201
>>> dataset = NM000201(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset, eegdash.dataset