NM000267: EEG dataset, 29 subjects#

Shin2017A

Access recordings and metadata through EEGDash.

Citation: Jaeyoung Shin, Alexander von Lühmann, Benjamin Blankertz, Do-Won Kim, Jichai Jeong, Han-Jeong Hwang, Klaus-Robert Müller (2019). Shin2017A. 10.1109/TNSRE.2016.2628057

Modality: EEG | Subjects: 29 | Recordings: 174 | License: GPL-3.0 | Source: NeMAR

Metadata: Complete (100%)

Quickstart#

Install

pip install eegdash

Access the data

from eegdash.dataset import NM000267

dataset = NM000267(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)

Filter by subject

dataset = NM000267(cache_dir="./data", subject="01")

Advanced query

dataset = NM000267(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)

Iterate recordings

for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])

If you use this dataset in your research, please cite the original authors.

BibTeX

@dataset{nm000267,
  title = {Shin2017A},
  author = {Jaeyoung Shin and Alexander von Lühmann and Benjamin Blankertz and Do-Won Kim and Jichai Jeong and Han-Jeong Hwang and Klaus-Robert Müller},
  doi = {10.1109/TNSRE.2016.2628057},
  url = {https://doi.org/10.1109/TNSRE.2016.2628057},
}

About This Dataset#

Shin2017A

Motor imagery dataset from Shin et al., 2017.


Dataset Overview

  • Code: Shin2017A

  • Paradigm: imagery

  • DOI: 10.1109/TNSRE.2016.2628057

  • Subjects: 29

  • Sessions per subject: 6

  • Events: left_hand=1, right_hand=2, subtraction=3, rest=4

  • Trial interval: [0, 10] s

  • File format: MATLAB

  • Data preprocessed: True
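The event codes listed above can be kept in a small mapping when selecting trials by label; a minimal sketch (the `labels_for` helper is illustrative, not part of EEGDash):

```python
# Event codes as listed in the dataset overview (label -> integer code).
EVENT_ID = {"left_hand": 1, "right_hand": 2, "subtraction": 3, "rest": 4}

def labels_for(codes):
    """Map a sequence of integer event codes back to their string labels."""
    code_to_label = {v: k for k, v in EVENT_ID.items()}
    return [code_to_label[c] for c in codes]

print(labels_for([1, 2, 4]))  # ['left_hand', 'right_hand', 'rest']
```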

Acquisition

  • Sampling rate: 200.0 Hz

  • Number of channels: 30

  • Channel types: eeg=30, eog=2

  • Channel names: AFF1h, AFF2h, AFF5h, AFF6h, AFp1, AFp2, CCP3h, CCP4h, CCP5h, CCP6h, Cz, F3, F4, F7, F8, FCC3h, FCC4h, FCC5h, FCC6h, HEOG, P3, P4, P7, P8, POO1, POO2, PPO1h, PPO2h, Pz, T7, T8, VEOG

  • Montage: 10-5

  • Hardware: BrainAmp

  • Reference: linked mastoids

  • Ground: Fz

  • Sensor type: active electrodes

  • Line frequency: 50.0 Hz

  • Cap manufacturer: EASYCAP GmbH

  • Cap model: custom-made stretchy fabric cap

  • Auxiliary channels: EOG (4 ch, horizontal, vertical), ECG, respiration
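The channel list above mixes the 30 EEG electrodes with the HEOG/VEOG channels; a small stdlib sketch of splitting them for bookkeeping (the names are copied from the list above; MNE or EEGDash would normally do this via channel types):

```python
# Channel names as listed for this dataset; HEOG/VEOG are EOG, the rest EEG.
CHANNELS = [
    "AFF1h", "AFF2h", "AFF5h", "AFF6h", "AFp1", "AFp2", "CCP3h", "CCP4h",
    "CCP5h", "CCP6h", "Cz", "F3", "F4", "F7", "F8", "FCC3h", "FCC4h",
    "FCC5h", "FCC6h", "HEOG", "P3", "P4", "P7", "P8", "POO1", "POO2",
    "PPO1h", "PPO2h", "Pz", "T7", "T8", "VEOG",
]

eog = [ch for ch in CHANNELS if ch in ("HEOG", "VEOG")]
eeg = [ch for ch in CHANNELS if ch not in eog]
print(len(eeg), len(eog))  # 30 2
```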

Participants

  • Number of subjects: 29

  • Health status: healthy

  • Age: mean=28.5, std=3.7

  • Gender distribution: male=14, female=15

  • Handedness: right=29, left=1

  • BCI experience: naive to MI experiments

  • Species: human

Experimental Protocol

  • Paradigm: imagery

  • Number of classes: 2

  • Class labels: left_hand, right_hand

  • Trial duration: 10.0 s

  • Study design: Dataset A: left- vs. right-hand motor imagery (kinesthetic imagery of opening and closing the hands)

  • Feedback type: none

  • Stimulus type: visual arrow and fixation cross

  • Stimulus modalities: visual, auditory

  • Primary modality: visual

  • Synchronicity: cued

  • Mode: offline

  • Instructions: Subjects were instructed to perform kinesthetic MI (i.e., to imagine opening and closing their hands as if grabbing a ball) to ensure that actual motor imagery, not visual imagery, was performed. Subjects were asked to imagine hand gripping (opening and closing their hands) at a 1 Hz pace.

HED Event Annotations

Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser

left_hand

     ├─ Sensory-event
     │  ├─ Experimental-stimulus
     │  ├─ Visual-presentation
     │  └─ Leftward, Arrow
     └─ Agent-action
        └─ Imagine
           ├─ Move
           └─ Left, Hand

right_hand
     ├─ Sensory-event
     │  ├─ Experimental-stimulus
     │  ├─ Visual-presentation
     │  └─ Rightward, Arrow
     └─ Agent-action
        └─ Imagine
           ├─ Move
           └─ Right, Hand

subtraction
     ├─ Sensory-event
     │  ├─ Experimental-stimulus
     │  └─ Visual-presentation
     └─ Agent-action
        └─ Imagine
           ├─ Think
           └─ Label/subtraction

rest
     ├─ Sensory-event
     │  ├─ Experimental-stimulus
     │  └─ Visual-presentation
     └─ Rest

Paradigm-Specific Parameters

  • Detected paradigm: motor_imagery

  • Number of repetitions: 20

  • Imagery tasks: left_hand, right_hand

  • Cue duration: 2.0 s

  • Imagery duration: 10.0 s
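The timing above maps directly to sample counts at the 200 Hz sampling rate; a quick sketch of the arithmetic (variable names are illustrative):

```python
# Timing from the summary above: 10 s imagery trials sampled at 200 Hz.
SFREQ = 200.0           # Hz, after downsampling
TMIN, TMAX = 0.0, 10.0  # s, trial interval relative to trial onset

start = int(round(TMIN * SFREQ))
stop = int(round(TMAX * SFREQ))
n_samples = stop - start
print(n_samples)  # 2000 samples per trial
```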

Data Structure

  • Trials: per_session=20, per_class_per_session=10, total_per_class=30

  • Blocks per session: 10

  • Trials context: 10 blocks per session, each block containing 2 trials (one left-, one right-hand MI), randomized
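The trial counts above are mutually consistent; a quick arithmetic check (numbers taken from the summary above; three MI sessions per subject matches the three-sessions-per-paradigm design described in the methodology):

```python
# Trial counts from the data-structure summary above.
trials_per_session = 20
per_class_per_session = 10
total_per_class = 30

n_classes = trials_per_session // per_class_per_session
mi_sessions = total_per_class // per_class_per_session
print(n_classes, mi_sessions)  # 2 3
```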

Preprocessing

  • Data state: preprocessed

  • Preprocessing applied: True

  • Steps: common average reference, bandpass filtering (0.5-50 Hz), ICA-based EOG rejection, downsampling to 200 Hz

  • Highpass filter: 0.5 Hz

  • Lowpass filter: 50.0 Hz

  • Bandpass filter: [0.5, 50.0] Hz

  • Filter type: Chebyshev type II

  • Filter order: 4

  • Artifact methods: ICA, EOG rejection

  • Re-reference: common average reference (CAR)

  • Downsampled to: 200.0 Hz

Signal Processing

  • Classifier: shrinkage LDA

  • Feature extraction: CSP, log-variance

  • Frequency bands: mu=[8.0, 12.0] Hz; beta=[12.0, 25.0] Hz; analyzed=[8.0, 25.0] Hz

  • Spatial filters: CSP
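The pipeline above uses log-variance features computed after CSP spatial filtering; a minimal stdlib sketch of just the log-variance step on made-up signals (CSP itself, which requires an eigendecomposition, is omitted):

```python
import math
from statistics import pvariance

def log_variance(signals):
    """Log-variance feature for each (spatially filtered) channel."""
    return [math.log(pvariance(sig)) for sig in signals]

# Toy "CSP component" time courses, not real data; the second has twice
# the amplitude of the first, so its variance is 4x larger.
toy = [[0.0, 1.0, 0.0, -1.0], [0.0, 2.0, 0.0, -2.0]]
feats = log_variance(toy)
print(round(feats[1] - feats[0], 4))  # 1.3863 == ln(4)
```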

Cross-Validation

  • Method: 5-fold cross-validation, repeated 10 times (10x5-fold)

  • Evaluation type: within-subject
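"10x5-fold" means 5-fold cross-validation repeated 10 times with reshuffled trial indices; a stdlib sketch of how such splits could be generated (the helper is illustrative, not the MOABB implementation):

```python
import random

def repeated_kfold(n_items, n_folds=5, n_repeats=10, seed=0):
    """Yield (repeat, fold, test_indices) for repeated k-fold CV."""
    rng = random.Random(seed)
    for rep in range(n_repeats):
        idx = list(range(n_items))
        rng.shuffle(idx)
        for fold in range(n_folds):
            # Every item lands in exactly one test fold per repeat.
            yield rep, fold, sorted(idx[fold::n_folds])

# 60 MI trials per subject (2 classes x 30 trials, from the summary above).
splits = list(repeated_kfold(60))
print(len(splits))  # 10 repeats x 5 folds = 50 splits
```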

Performance (Original Study)

  • Accuracy: 65.6%

  • EEG accuracy: 65.6%

  • HbR accuracy: 66.5%

  • HbO accuracy: 63.5%

  • EEG+HbR+HbO accuracy: 74.2%

BCI Application

  • Applications: motor control

  • Environment: laboratory

  • Online feedback: False

Tags

  • Pathology: Healthy

  • Modality: Motor

  • Type: Imagery

Documentation

  • Description: Open-access dataset for hybrid brain-computer interfaces (BCIs) using electroencephalography (EEG) and near-infrared spectroscopy (NIRS). The dataset includes two BCI experiments: left- versus right-hand motor imagery, and mental arithmetic versus resting state.

  • DOI: 10.1109/TNSRE.2016.2628057

  • License: GPL-3.0

  • Investigators: Jaeyoung Shin, Alexander von Lühmann, Benjamin Blankertz, Do-Won Kim, Jichai Jeong, Han-Jeong Hwang, Klaus-Robert Müller

  • Senior author: Klaus-Robert Müller

  • Contact: h2j@kumoh.ac.kr; klaus-robert.mueller@tu-berlin.de

  • Institution: Berlin Institute of Technology

  • Department: Machine Learning Group, Department of Computer Science

  • Address: 10587 Berlin, Germany

  • Country: DE

  • Repository: GitHub

  • Data URL: http://doc.ml.tu-berlin.de/hBCI

  • Publication year: 2017

  • Funding: Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF2014R1A6A3A03057524); Ministry of Science, ICT & Future Planning (NRF-2015R1C1A1A02037032); Brain Korea 21 PLUS Program through the NRF funded by the Ministry of Education; Korea University Grant; BMBF (#01GQ0850, Bernstein Focus: Neurotechnology)

  • Ethics approval: Ethics Committee of the Institute of Psychology and Ergonomics, Technical University of Berlin (approval number: SH_01_20150330); Declaration of Helsinki

  • Keywords: brain-computer interface (BCI), electroencephalography (EEG), hybrid BCI, mental arithmetic, motor imagery, near-infrared spectroscopy (NIRS), open access dataset

Abstract

We provide an open access dataset for hybrid brain-computer interfaces (BCIs) using electroencephalography (EEG) and near-infrared spectroscopy (NIRS). For this, we conducted two BCI experiments (left versus right hand motor imagery; mental arithmetic versus resting state). The dataset was validated using baseline signal analysis methods, with which classification performance was evaluated for each modality and for a combination of both modalities. As already shown in previous literature, the capability of discriminating different mental states can be enhanced by a hybrid approach compared to single-modality analyses. This makes the provided data highly suitable for hybrid BCI investigations. Since our open access dataset also comprises motion artifacts and physiological data, we expect that it can be used in a wide range of future validation approaches in multimodal BCI research.

Methodology

Twenty-nine right-handed and one left-handed healthy subjects participated in motor imagery and mental arithmetic tasks. EEG data were recorded at 1000 Hz using 30 active electrodes with a BrainAmp amplifier, referenced to linked mastoids. NIRS data were collected at 12.5 Hz using a NIRScout system with 14 sources and 16 detectors, resulting in 36 channels. Three sessions were conducted for each paradigm (MI and MA). Each session included 20 trials with 10 s task periods and 15-17 s rest periods. For MI, subjects performed kinesthetic hand-gripping imagery at a 1 Hz pace. Visual instructions consisted of arrows for MI and arithmetic problems for MA. Motion artifacts from eye and head movements were also recorded. Signal processing included CSP for spatial filtering, log-variance features, and a shrinkage LDA classifier with 10x5-fold cross-validation.

References

Shin, J., von Lühmann, A., Blankertz, B., Kim, D.W., Jeong, J., Hwang, H.J. and Müller, K.R. (2017). Open access dataset for EEG+NIRS single-trial classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(10), pp. 1735-1745.

GNU General Public License, Version 3. https://www.gnu.org/licenses/gpl-3.0.txt

Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896

Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A. and Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8

Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks). https://github.com/NeuroTechX/moabb

Dataset Information#

Dataset ID

NM000267

Title

Shin2017A

Author (year)

Shin2017_Shin2017A

Canonical

Shin2017A

Importable as

NM000267, Shin2017_Shin2017A, Shin2017A

Year

2019

Authors

Jaeyoung Shin, Alexander von Lühmann, Benjamin Blankertz, Do-Won Kim, Jichai Jeong, Han-Jeong Hwang, Klaus-Robert Müller

License

GPL-3.0

Citation / DOI

doi:10.1109/TNSRE.2016.2628057

Source links

OpenNeuro | NeMAR | Source URL

Copy-paste BibTeX
@dataset{nm000267,
  title = {Shin2017A},
  author = {Jaeyoung Shin and Alexander von Lühmann and Benjamin Blankertz and Do-Won Kim and Jichai Jeong and Han-Jeong Hwang and Klaus-Robert Müller},
  doi = {10.1109/TNSRE.2016.2628057},
  url = {https://doi.org/10.1109/TNSRE.2016.2628057},
}

Found an issue with this dataset?

If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!

Report an Issue on GitHub

Technical Details#

Subjects & recordings
  • Subjects: 29

  • Recordings: 174

  • Tasks: 1

Channels & sampling rate
  • Channels: 32

  • Sampling rate (Hz): 200.0

  • Duration (hours): 29.03

Tags
  • Pathology: Healthy

  • Modality: Visual

  • Type: Motor

Files & format
  • Size on disk: 1.9 GB

  • File count: 174

  • Format: BIDS

License & citation
  • License: GPL-3.0

  • DOI: doi:10.1109/TNSRE.2016.2628057

Provenance

API Reference#

Use the NM000267 class to access this dataset programmatically.

class eegdash.dataset.NM000267(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

Shin2017A

Study:

nm000267 (NeMAR)

Author (year):

Shin2017_Shin2017A

Canonical:

Shin2017A

Also importable as: NM000267, Shin2017_Shin2017A, Shin2017A.

Modality: eeg; Experiment type: Motor; Subject type: Healthy. Subjects: 29; recordings: 174; tasks: 1.

Parameters:
  • cache_dir (str | Path) – Directory where data are cached locally.

  • query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.

  • s3_bucket (str | None) – Base S3 bucket used to locate the data.

  • **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

data_dir#

Local dataset cache directory (cache_dir / dataset_id).

Type:

Path

query#

Merged query with the dataset filter applied.

Type:

dict

records#

Metadata records used to build the dataset, if pre-fetched.

Type:

list[dict] | None

Notes

Each item is a recording; recording-level metadata are available via dataset.description. The query argument supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the fixed dataset filter. Dataset-specific caveats are not provided in the summary metadata.
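The query merging described above can be pictured as a simple dictionary combination; a hypothetical sketch (the actual EEGDashDataset internals may differ):

```python
def merge_query(dataset_id, user_query=None):
    """AND a user-supplied MongoDB-style filter with the fixed dataset filter.

    Illustrative only; mirrors the documented contract that the user query
    must not contain the reserved key 'dataset'.
    """
    user_query = dict(user_query or {})
    if "dataset" in user_query:
        raise ValueError("query must not contain the key 'dataset'")
    return {"dataset": dataset_id, **user_query}

print(merge_query("nm000267", {"subject": {"$in": ["01", "02"]}}))
# {'dataset': 'nm000267', 'subject': {'$in': ['01', '02']}}
```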

References

OpenNeuro dataset: https://openneuro.org/datasets/nm000267

NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000267

DOI: https://doi.org/10.1109/TNSRE.2016.2628057

Examples

>>> from eegdash.dataset import NM000267
>>> dataset = NM000267(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
__init__(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
save(path, overwrite=False)[source]#

Save the dataset to disk.

Parameters:
  • path (str or Path) – Destination file path.

  • overwrite (bool, default False) – If True, overwrite existing file.

Return type:

None

See Also#