NM000268: EEG dataset, 29 subjects#
Shin2017B
Access recordings and metadata through EEGDash.
Citation: Jaeyoung Shin, Alexander von Lühmann, Benjamin Blankertz, Do-Won Kim, Jichai Jeong, Han-Jeong Hwang, Klaus-Robert Müller (2019). Shin2017B. 10.1109/TNSRE.2016.2628057
Modality: eeg Subjects: 29 Recordings: 174 License: GPL-3.0 Source: nemar
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import NM000268
dataset = NM000268(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = NM000268(cache_dir="./data", subject="01")
Advanced query
dataset = NM000268(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{nm000268,
  title  = {Shin2017B},
  author = {Jaeyoung Shin and Alexander von Lühmann and Benjamin Blankertz and Do-Won Kim and Jichai Jeong and Han-Jeong Hwang and Klaus-Robert Müller},
  doi    = {10.1109/TNSRE.2016.2628057},
  url    = {https://doi.org/10.1109/TNSRE.2016.2628057},
}
About This Dataset#
Shin2017B
Mental Arithmetic Dataset from Shin et al 2017.
Dataset Overview
- Code: Shin2017B
- Paradigm: imagery
- DOI: 10.1109/TNSRE.2016.2628057
- Subjects: 29
- Sessions per subject: 6
- Events: left_hand=1, right_hand=2, subtraction=3, rest=4
- Trial interval: [0, 10] s
- Session IDs: 1arithmetic, 3arithmetic, 5arithmetic
- File format: MATLAB
- Data preprocessed: True
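For reference, the event coding above as a plain Python mapping (the names here are illustrative, not part of the EEGDash API); dataset B itself uses only the subtraction and rest classes:

```python
# Event coding from the dataset overview; dataset B uses only the
# subtraction and rest classes.
EVENT_ID = {"left_hand": 1, "right_hand": 2, "subtraction": 3, "rest": 4}
DATASET_B = {name: code for name, code in EVENT_ID.items()
             if name in ("subtraction", "rest")}
print(DATASET_B)  # {'subtraction': 3, 'rest': 4}
```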
Acquisition
- Sampling rate: 200.0 Hz
- Number of channels: 30 EEG + 2 EOG
- Channel types: eeg=30, eog=2
- Channel names: AFF1h, AFF2h, AFF5h, AFF6h, AFp1, AFp2, CCP3h, CCP4h, CCP5h, CCP6h, Cz, F3, F4, F7, F8, FCC3h, FCC4h, FCC5h, FCC6h, HEOG, P3, P4, P7, P8, POO1, POO2, PPO1h, PPO2h, Pz, T7, T8, VEOG
- Montage: 10-5
- Hardware: BrainAmp
- Software: MATLAB R2013b
- Reference: linked mastoids
- Ground: Fz
- Sensor type: active electrodes
- Line frequency: 50.0 Hz
- Cap manufacturer: EASYCAP GmbH
- Cap model: custom-made stretchy fabric cap
- Auxiliary channels: EOG (4 ch, horizontal, vertical), ECG, respiration
Participants
- Number of subjects: 29
- Health status: healthy
- Age: mean=28.5, std=3.7
- Gender distribution: male=14, female=15
- Handedness: right=29, left=1
- BCI experience: naive to MI experiment
- Species: human
Experimental Protocol
- Paradigm: imagery
- Number of classes: 2
- Class labels: subtraction, rest
- Trial duration: 10.0 s
- Trials per class: subtraction=30, rest=30
- Study design: Dataset B: mental arithmetic (serial subtraction of a one-digit number) versus baseline/rest task
- Feedback type: none
- Stimulus type: visual instruction (subtraction problem and fixation cross)
- Stimulus modalities: visual, auditory
- Primary modality: visual
- Synchronicity: cued-synchronous
- Mode: offline
- Training/test split: False
- Instructions: For the MA task, subjects memorized an initial subtraction (three-digit minus one-digit number) displayed for 2 s, then repeatedly subtracted the one-digit number from each result. For the baseline, subjects rested with no specific thought.
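With the 10 s trial interval at the 200 Hz sampling rate, each trial spans 2000 samples. A minimal NumPy sketch of cutting trials out of a continuous array (the event onsets here are hypothetical, and this is not the EEGDash API):

```python
import numpy as np

SFREQ = 200.0                            # Hz, per the acquisition metadata
TMIN, TMAX = 0.0, 10.0                   # trial interval in seconds
n_samples = int((TMAX - TMIN) * SFREQ)   # 2000 samples per trial

# Continuous data as (channels, time); event onsets in samples (hypothetical)
data = np.random.randn(30, 60 * 200)
event_onsets = [1000, 5000, 9000]

trials = np.stack([data[:, on:on + n_samples] for on in event_onsets])
print(trials.shape)  # (3, 30, 2000)
```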
HED Event Annotations
Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser
left_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Move
└─ Left, Hand
right_hand
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Move
└─ Right, Hand
subtraction
├─ Sensory-event, Experimental-stimulus, Visual-presentation
└─ Agent-action
└─ Imagine
├─ Think
└─ Label/subtraction
rest
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Rest
Paradigm-Specific Parameters
- Detected paradigm: motor_imagery
- Number of repetitions: 20
Data Structure
- Trials: per_session=20, per_condition_session=10, per_condition_total=30
- Trials context: Each session: 1 min pre-experiment rest + 20 trials + 1 min post-experiment rest. Trial: 2 s visual instruction + 10 s task + 15-17 s random rest.
Preprocessing
- Data state: preprocessed
- Preprocessing applied: True
- Steps: common average reference, bandpass filtering (0.5-50 Hz), ICA-based EOG rejection, downsampling to 200 Hz
- Highpass filter: 0.5 Hz
- Lowpass filter: 50.0 Hz
- Bandpass filter: [0.5, 50.0] Hz
- Filter type: Chebyshev type II
- Filter order: 4
- Artifact methods: EOG correction, ICA
- Re-reference: CAR (common average reference)
- Downsampled to: 200.0 Hz
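The 4th-order Chebyshev type II band-pass can be reproduced with SciPy; the stopband attenuation (40 dB here) is an assumption, since the metadata does not state it:

```python
import numpy as np
from scipy.signal import cheby2, sosfiltfilt

FS = 200.0                  # Hz, sampling rate after downsampling
LOW, HIGH = 0.5, 50.0       # band edges from the preprocessing metadata

# 4th-order Chebyshev II band-pass; the 40 dB stopband attenuation is assumed
sos = cheby2(N=4, rs=40, Wn=[LOW, HIGH], btype="bandpass", fs=FS, output="sos")

eeg = np.random.randn(30, 2000)              # (channels, samples) dummy data
filtered = sosfiltfilt(sos, eeg, axis=-1)    # zero-phase filtering
print(filtered.shape)  # (30, 2000)
```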
Signal Processing
- Classifiers: LDA, shrinkage LDA
- Feature extraction: CSP, log-variance
- Frequency bands: analyzed=[4.0, 35.0] Hz
- Spatial filters: CSP
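A compact sketch of the CSP + log-variance feature step (a generic CSP via a generalized eigendecomposition, not the original study's code):

```python
import numpy as np
from scipy.linalg import eigh

def csp_logvar(trials_a, trials_b, n_comp=3):
    """CSP from two classes of (trials, channels, samples) data, returning
    log-variance features of the first/last n_comp components."""
    cov = lambda trials: np.mean(
        [t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
    ca, cb = cov(trials_a), cov(trials_b)
    # Generalized eigenproblem: ca w = lambda (ca + cb) w
    _, w = eigh(ca, ca + cb)
    filters = np.hstack([w[:, :n_comp], w[:, -n_comp:]]).T  # (2*n_comp, ch)
    def features(trials):
        proj = np.einsum("fc,ncs->nfs", filters, trials)
        return np.log(proj.var(axis=-1))
    return features(trials_a), features(trials_b)

rng = np.random.default_rng(0)
fa, fb = csp_logvar(rng.normal(size=(30, 30, 600)),
                    rng.normal(size=(30, 30, 600)))
print(fa.shape)  # (30, 6)
```

The resulting features would normally be fed to the (shrinkage) LDA classifier mentioned above.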
Cross-Validation
- Method: 10x5-fold (5-fold cross-validation repeated 10 times)
- Folds: 5
- Evaluation type: within_subject
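The 10x5-fold scheme maps directly onto scikit-learn's RepeatedStratifiedKFold, shown here on dummy labels (a sketch under that mapping, not the study's exact code):

```python
import numpy as np
from sklearn.model_selection import RepeatedStratifiedKFold

X = np.random.randn(60, 6)     # 60 trials, 6 CSP log-variance features
y = np.repeat([0, 1], 30)      # subtraction vs rest labels

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
n_evals = sum(1 for _ in cv.split(X, y))
print(n_evals)  # 50 train/test evaluations (10 repeats x 5 folds)
```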
Performance (Original Study)
- MA EEG max accuracy: 75.9%
- MA HbR max accuracy: 80.7%
- MA HbO max accuracy: 83.6%
BCI Application
- Applications: hybrid_bci_research
- Environment: laboratory
- Online feedback: False
Tags
- Pathology: Healthy
- Modality: Cognitive
- Type: Cognitive
Documentation
- Description: Open access dataset for hybrid brain-computer interfaces using EEG and NIRS with motor imagery and mental arithmetic tasks
- DOI: 10.1109/TNSRE.2016.2628057
- License: GPL-3.0
- Investigators: Jaeyoung Shin, Alexander von Lühmann, Benjamin Blankertz, Do-Won Kim, Jichai Jeong, Han-Jeong Hwang, Klaus-Robert Müller
- Senior author: Klaus-Robert Müller
- Contact: h2j@kumoh.ac.kr; klaus-robert.mueller@tu-berlin.de
- Institution: Berlin Institute of Technology
- Department: Department of Computer Science, Machine Learning Group
- Address: 10587 Berlin, Germany
- Country: DE
- Repository: GitHub
- Data URL: http://doc.ml.tu-berlin.de/hBCI
- Publication year: 2017
- Funding: Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2014R1A6A3A03057524); Ministry of Science, ICT & Future Planning (NRF-2015R1C1A1A02037032); Brain Korea 21 PLUS Program through the NRF funded by the Ministry of Education; Korea University Grant; BMBF (#01GQ0850, Bernstein Focus: Neurotechnology)
- Ethics approval: Ethics Committee of the Institute of Psychology and Ergonomics, Technical University of Berlin (approval number: SH_01_20150330)
- Keywords: Brain-computer interface, BCI, electroencephalography, EEG, hybrid BCI, mental arithmetic, motor imagery, near-infrared spectroscopy, NIRS, open access dataset
Abstract
Open access dataset for hybrid brain-computer interfaces using EEG and NIRS. Includes two experiments: (1) left vs right hand motor imagery, (2) mental arithmetic vs resting state. Dataset validated using baseline signal analysis showing hybrid approach enhances discrimination of mental states. Also includes motion artifacts and physiological data for wide range of validation approaches.
Methodology
Thirty subjects performed 6 sessions alternating between motor imagery (dataset A: left/right hand) and mental arithmetic (dataset B: MA vs rest). Each session: 20 trials with 2s cue, 10s task, 15-17s rest. EEG recorded at 1000 Hz with 30 channels, downsampled to 200 Hz. Preprocessing: CAR, 0.5-50 Hz bandpass (4th order Chebyshev II), ICA-based EOG rejection. Feature extraction: CSP with log-variance of first/last 3 components using 3s moving window (1s step). Classification: shrinkage LDA with 10x5-fold CV. Hybrid analysis combines EEG and NIRS outputs using meta-classifier.
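The 3 s moving window with 1 s step described above can be sketched in NumPy (dummy data; sample counts follow the 200 Hz rate):

```python
import numpy as np

FS = 200                      # Hz
WIN, STEP = 3 * FS, 1 * FS    # 600-sample window, 200-sample step

trial = np.random.randn(30, 10 * FS)   # one 10 s trial: (channels, samples)
starts = range(0, trial.shape[1] - WIN + 1, STEP)
windows = np.stack([trial[:, s:s + WIN] for s in starts])
print(windows.shape)  # (8, 30, 600): 8 windows per 10 s trial
```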
References
Shin, J., von Lühmann, A., Blankertz, B., Kim, D.-W., Jeong, J., Hwang, H.-J. and Müller, K.-R. (2017). Open access dataset for EEG+NIRS single-trial classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(10), pp. 1735-1745.
GNU General Public License, Version 3. https://www.gnu.org/licenses/gpl-3.0.txt
Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(1896). https://doi.org/10.21105/joss.01896
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A. and Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8
Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks). https://github.com/NeuroTechX/moabb
Dataset Information#
Dataset ID | nm000268
Title | Shin2017B
Author (year) | Shin2017_Shin2017B
Canonical | Shin2017B
Importable as | NM000268, Shin2017_Shin2017B, Shin2017B
Year | 2019
Authors | Jaeyoung Shin, Alexander von Lühmann, Benjamin Blankertz, Do-Won Kim, Jichai Jeong, Han-Jeong Hwang, Klaus-Robert Müller
License | GPL-3.0
Citation / DOI | 10.1109/TNSRE.2016.2628057
Source links | OpenNeuro, NeMAR, Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 29
Recordings: 174
Tasks: 1
Channels: 32
Sampling rate (Hz): 200.0
Duration (hours): 29.03
Pathology: Healthy
Modality: Visual
Type: Memory
Size on disk: 1.9 GB
File count: 174
Format: BIDS
License: GPL-3.0
DOI: 10.1109/TNSRE.2016.2628057
API Reference#
Use the NM000268 class to access this dataset programmatically.
- class eegdash.dataset.NM000268(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDatasetShin2017B
- Study: nm000268 (NeMAR)
- Author (year): Shin2017_Shin2017B
- Canonical: Shin2017B
Also importable as: NM000268, Shin2017_Shin2017B, Shin2017B. Modality: eeg; Experiment type: Memory; Subject type: Healthy. Subjects: 29; recordings: 174; tasks: 1.
- Parameters:
  - cache_dir (str | Path) – Directory where data are cached locally.
  - query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
  - s3_bucket (str | None) – Base S3 bucket used to locate the data.
  - **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
  Local dataset cache directory (cache_dir / dataset_id).
  Type: Path
- query#
  Merged query with the dataset filter applied.
  Type: dict
- records#
  Metadata records used to build the dataset, if pre-fetched.
  Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/nm000268 NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000268 DOI: https://doi.org/10.1109/TNSRE.2016.2628057
Examples
>>> from eegdash.dataset import NM000268
>>> dataset = NM000268(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset, eegdash.dataset