NM000140: EEG dataset, 12 subjects#

BNCI 2015-001 Motor Imagery dataset

Access recordings and metadata through EEGDash.

Citation: Josef Faller, Carmen Vidaurre, Teodoro Solis-Escalante, Christa Neuper, Reinhold Scherer (2012). BNCI 2015-001 Motor Imagery dataset.

Modality: eeg Subjects: 12 Recordings: 28 License: CC-BY-NC-ND-4.0 Source: nemar

Metadata completeness: 90%

Quickstart#

Install

pip install eegdash

Access the data

from eegdash.dataset import NM000140

dataset = NM000140(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)

Filter by subject

dataset = NM000140(cache_dir="./data", subject="01")

Advanced query

dataset = NM000140(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)

Iterate recordings

for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])

If you use this dataset in your research, please cite the original authors.

BibTeX

@dataset{nm000140,
  title  = {BNCI 2015-001 Motor Imagery dataset},
  author = {Josef Faller and Carmen Vidaurre and Teodoro Solis-Escalante and Christa Neuper and Reinhold Scherer},
  year   = {2012},
  doi    = {10.1109/tnsre.2012.2189584},
}

About This Dataset#

BNCI 2015-001 Motor Imagery dataset

Dataset Overview

  • Code: BNCI2015-001

  • Paradigm: imagery

  • DOI: 10.1109/tnsre.2012.2189584

  • Subjects: 12

  • Sessions per subject: 2

  • Events: right_hand=1, feet=2

  • Trial interval: [0, 5] s

  • File format: gdf

  • Data preprocessed: True
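Given the event codes (right_hand=1, feet=2), the [0, 5] s trial interval, and the 512 Hz sampling rate listed above, extracting trials from a continuous recording can be sketched as follows. This is a minimal NumPy sketch on synthetic data; `epoch_trials` and the event onsets are illustrative, not part of the EEGDash API:

```python
import numpy as np

SFREQ = 512.0          # sampling rate from the acquisition metadata
TMIN, TMAX = 0.0, 5.0  # trial interval [0, 5] s

def epoch_trials(data, onsets_s, sfreq=SFREQ, tmin=TMIN, tmax=TMAX):
    """Slice continuous (n_channels, n_samples) data into fixed-length trials."""
    n_samp = int(round((tmax - tmin) * sfreq))
    trials = []
    for onset in onsets_s:
        start = int(round((onset + tmin) * sfreq))
        trials.append(data[:, start:start + n_samp])
    return np.stack(trials)  # (n_trials, n_channels, n_samples)

# Synthetic example: 13 channels, 60 s of data, two cue onsets
rng = np.random.default_rng(0)
data = rng.standard_normal((13, int(60 * SFREQ)))
epochs = epoch_trials(data, onsets_s=[10.0, 30.0])
print(epochs.shape)  # (2, 13, 2560)
```

In practice the event onsets come from the recording's annotations; MNE's epoching utilities do the same slicing with boundary checks.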

Acquisition

  • Sampling rate: 512.0 Hz

  • Number of channels: 13

  • Channel types: eeg=13

  • Channel names: FC3, FCz, FC4, C5, C3, C1, Cz, C2, C4, C6, CP3, CPz, CP4

  • Montage: 10-20

  • Hardware: g.tec

  • Software: Matlab

  • Reference: CAR (common average reference)

  • Sensor type: active electrode

  • Line frequency: 50.0 Hz

  • Online filters: 50 Hz notch

  • Cap manufacturer: g.tec

  • Cap model: g.GAMMAsys

  • Auxiliary channels: gsr

Participants

  • Number of subjects: 12

  • Health status: healthy

  • Age: mean=24.8

  • Gender distribution: male=7, female=5

  • Handedness: all right-handed

  • BCI experience: naive

  • Species: human

Experimental Protocol

  • Paradigm: imagery

  • Number of classes: 2

  • Class labels: right_hand, feet

  • Trial duration: 11.0 s

  • Study design: Two-class motor imagery: sustained right hand movement imagery (palmar grip) versus both feet movement imagery (plantar extension)

  • Feedback type: visual

  • Stimulus type: cursor_feedback

  • Stimulus modalities: visual, auditory

  • Primary modality: visual

  • Synchronicity: synchronous

  • Mode: training

  • Instructions: Relax during reference period (3s), perform sustained kinesthetic movement imagery during activity period. Condition 1 (arrow right): imagine palmar grip with right hand. Condition 2 (arrow down): imagine plantar extension of both feet.

HED Event Annotations

Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser

right_hand
     ├─ Sensory-event, Experimental-stimulus, Visual-presentation
     └─ Agent-action
        └─ Imagine
           ├─ Move
           └─ Right, Hand

feet
     ├─ Sensory-event, Experimental-stimulus, Visual-presentation
     └─ Agent-action
        └─ Imagine, Move, Foot

Paradigm-Specific Parameters

  • Detected paradigm: motor_imagery

  • Imagery tasks: right_hand_palmar_grip, both_feet_plantar_extension

  • Cue duration: 1.25 s

  • Imagery duration: 4.0 s

Data Structure

  • Trials: 200

  • Trials per class: right_hand=100, feet=100

  • Trials context: per_session

Preprocessing

  • Data state: filtered

  • Preprocessing applied: True

  • Steps: bandpass filter, notch filter

  • Highpass filter: 0.5 Hz

  • Lowpass filter: 100.0 Hz

  • Bandpass filter: 0.5–100.0 Hz

  • Notch filter: [50.0] Hz

  • Re-reference: CAR
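The listed preprocessing (0.5–100 Hz band-pass, 50 Hz notch, common average reference) can be sketched with SciPy. The filter order and notch Q below are assumptions, since the metadata does not specify them:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

SFREQ = 512.0  # Hz, from the acquisition metadata

def preprocess(data, sfreq=SFREQ):
    """Apply the listed steps to a (n_channels, n_samples) array."""
    # 0.5-100 Hz band-pass (4th-order Butterworth; order is an assumption)
    b, a = butter(4, [0.5, 100.0], btype="bandpass", fs=sfreq)
    out = filtfilt(b, a, data, axis=-1)
    # 50 Hz notch for European line noise (Q is an assumption)
    b, a = iirnotch(50.0, Q=30.0, fs=sfreq)
    out = filtfilt(b, a, out, axis=-1)
    # Common average reference (CAR): subtract the mean across channels
    return out - out.mean(axis=0, keepdims=True)

rng = np.random.default_rng(0)
clean = preprocess(rng.standard_normal((13, 5120)))  # 10 s of synthetic data
```

Since the dataset is distributed already filtered, this is only useful as a reference for what the recordings have undergone, or for reproducing the chain on other raw data.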

Signal Processing

  • Classifiers: LDA

  • Feature extraction: logarithmic bandpower, CSP

  • Frequency bands: alpha=[10, 13] Hz; beta=[16, 24] Hz
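A rough sketch of the original study's feature pipeline, using logarithmic bandpower in the alpha (10–13 Hz) and beta (16–24 Hz) bands followed by LDA. The Welch parameters and synthetic data are illustrative, and the CSP spatial-filtering step is omitted for brevity:

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

BANDS = {"alpha": (10.0, 13.0), "beta": (16.0, 24.0)}  # Hz, from the metadata

def log_bandpower(epochs, sfreq=512.0):
    """(n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
    freqs, psd = welch(epochs, fs=sfreq, nperseg=512, axis=-1)
    feats = [np.log(psd[..., (freqs >= lo) & (freqs <= hi)].mean(axis=-1))
             for lo, hi in BANDS.values()]
    return np.concatenate(feats, axis=-1)

# Synthetic stand-in for epoched trials (200 trials, 13 channels, 4 s imagery)
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 13, 2048))
y = rng.integers(0, 2, 200)  # right_hand vs feet labels

X = log_bandpower(epochs)
clf = LinearDiscriminantAnalysis().fit(X, y)
print(X.shape)  # (200, 26)
```

For a faithful reproduction, `mne.decoding.CSP` would be fitted before the bandpower step.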

Cross-Validation

  • Method: leave-one-out

  • Evaluation type: cross_session
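A cross-session evaluation of this kind (train on one session, test on the held-out one) can be sketched with scikit-learn's `LeaveOneGroupOut`, treating each session as a group. The features and labels below are synthetic placeholders:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((400, 26))   # e.g. log-bandpower features
y = rng.integers(0, 2, 400)          # right_hand vs feet
sessions = np.repeat([1, 2], 200)    # two sessions per subject

# Each fold trains on one session and tests on the held-out session
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                         cv=LeaveOneGroupOut(), groups=sessions)
print(len(scores))  # 2
```

With real features, the mean of `scores` is comparable to the ~80% accuracy reported by the original study.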

Performance (Original Study)

  • Accuracy: 80.0%

BCI Application

  • Applications: communication, control

  • Online feedback: True

Tags

  • Pathology: Healthy

  • Modality: Motor

  • Type: Motor

Documentation

  • DOI: 10.1109/tnsre.2012.2189584

  • License: CC-BY-NC-ND-4.0

  • Investigators: Josef Faller, Carmen Vidaurre, Teodoro Solis-Escalante, Christa Neuper, Reinhold Scherer

  • Senior author: Reinhold Scherer

  • Contact: josef.faller@tugraz.at; christa.neuper@uni-graz.at; carmen.vidaurre@tu-berlin.de

  • Institution: Graz University of Technology

  • Department: Institute of Knowledge Discovery

  • Address: 8010 Graz, Austria

  • Country: Austria

  • Repository: BNCI Horizon

  • Publication year: 2012

  • Funding: FP7 Framework EU Research Project BrainAble (No. 247447)

References

Faller, J., Vidaurre, C., Solis-Escalante, T., Neuper, C., & Scherer, R. (2012). Autocalibration and recurrent adaptation: Towards a plug and play online ERD-BCI. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 20(3), 313-319. https://doi.org/10.1109/tnsre.2012.2189584

Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., & Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(44), 1896. https://doi.org/10.21105/joss.01896

Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., & Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8

Notes

BNCI2015_001 was previously named BNCI2015001; BNCI2015001 will be removed in version 1.1. (Added in version 0.4.0.)

Generated by MOABB 1.4.3 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb

Dataset Information#

Dataset ID

NM000140

Title

BNCI 2015-001 Motor Imagery dataset

Author (year)

Faller2015

Canonical

BNCI2015, BNCI2015001

Importable as

NM000140, Faller2015, BNCI2015, BNCI2015001

Year

2012

Authors

Josef Faller, Carmen Vidaurre, Teodoro Solis-Escalante, Christa Neuper, Reinhold Scherer

License

CC-BY-NC-ND-4.0

Citation / DOI

Unknown

Source links

OpenNeuro | NeMAR | Source URL

Found an issue with this dataset?

If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!

Report an Issue on GitHub

Technical Details#

Subjects & recordings
  • Subjects: 12

  • Recordings: 28

  • Tasks: 1

Channels & sampling rate
  • Channels: 13

  • Sampling rate (Hz): 512.0

  • Duration (hours): 16.69

Tags
  • Pathology: Healthy

  • Modality: Visual

  • Type: Motor

Files & format
  • Size on disk: 1.1 GB

  • File count: 28

  • Format: BIDS

License & citation
  • License: CC-BY-NC-ND-4.0

  • DOI: —

API Reference#

Use the NM000140 class to access this dataset programmatically.

class eegdash.dataset.NM000140(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

BNCI 2015-001 Motor Imagery dataset

Study:

nm000140 (NeMAR)

Author (year):

Faller2015

Canonical:

BNCI2015, BNCI2015001

Also importable as: NM000140, Faller2015, BNCI2015, BNCI2015001.

Modality: eeg; Experiment type: Motor; Subject type: Healthy. Subjects: 12; recordings: 28; tasks: 1.

Parameters:
  • cache_dir (str | Path) – Directory where data are cached locally.

  • query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.

  • s3_bucket (str | None) – Base S3 bucket used to locate the data.

  • **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

data_dir#

Local dataset cache directory (cache_dir / dataset_id).

Type:

Path

query#

Merged query with the dataset filter applied.

Type:

dict

records#

Metadata records used to build the dataset, if pre-fetched.

Type:

list[dict] | None

Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References

OpenNeuro dataset: https://openneuro.org/datasets/nm000140 NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000140

Examples

>>> from eegdash.dataset import NM000140
>>> dataset = NM000140(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
__init__(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
save(path, overwrite=False)[source]#

Save the dataset to disk.

Parameters:
  • path (str or Path) – Destination file path.

  • overwrite (bool, default False) – If True, overwrite existing file.

Return type:

None

See Also#