NM000145: EEG dataset, 10 subjects
Munich Motor Imagery dataset
Access recordings and metadata through EEGDash.
Citation: Moritz Grosse-Wentrup, Christian Liefhold, Klaus Gramann, Martin Buss (2009). Munich Motor Imagery dataset.
Modality: EEG | Subjects: 10 | Recordings: 10 | License: CC-BY-4.0 | Source: NeMAR
Metadata completeness: 90%
Quickstart
Install:

```shell
pip install eegdash
```

Access the data:

```python
from eegdash.dataset import NM000145

dataset = NM000145(cache_dir="./data")

# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
```
Filter by subject:

```python
dataset = NM000145(cache_dir="./data", subject="01")
```
Advanced query
dataset = NM000145(
cache_dir="./data",
query={"subject": {"$in": ["01", "02"]}},
)
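To make the advanced query above concrete, here is a minimal pure-Python sketch of what a MongoDB-style `$in` filter selects. The record dictionaries and the `matches_in_filter` helper are illustrative assumptions, not part of the EEGDash API:

```python
def matches_in_filter(record: dict, query: dict) -> bool:
    """Check one metadata record against a MongoDB-style query
    supporting plain equality and the $in operator."""
    for field, condition in query.items():
        if isinstance(condition, dict) and "$in" in condition:
            if record.get(field) not in condition["$in"]:
                return False
        elif record.get(field) != condition:  # plain equality match
            return False
    return True

# Hypothetical recording metadata, one dict per recording (subjects 01-10)
records = [{"subject": f"{i:02d}"} for i in range(1, 11)]
query = {"subject": {"$in": ["01", "02"]}}

selected = [r for r in records if matches_in_filter(r, query)]
# Only the recordings for subjects "01" and "02" remain
```

The real query is evaluated server-side against the dataset's metadata index; this sketch only mirrors the selection semantics.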
Iterate recordings:

```python
for rec in dataset:
    print(rec.subject, rec.raw.info["sfreq"])
```
If you use this dataset in your research, please cite the original authors.
BibTeX:

```bibtex
@dataset{nm000145,
  title  = {Munich Motor Imagery dataset},
  author = {Moritz Grosse-Wentrup and Christian Liefhold and Klaus Gramann and Martin Buss},
  year   = {2009},
}
```
About This Dataset
Munich Motor Imagery dataset
Dataset Overview
Code: GrosseWentrup2009
Paradigm: imagery
DOI: 10.1109/TBME.2008.2009768
Subjects: 10
Sessions per subject: 1
Events: right_hand=2, left_hand=1
Trial interval: [0, 7] s
File format: set
Data preprocessed: True
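The event codes and trial interval listed above translate directly into an epoching window. A small sketch (plain Python, all numbers taken from the overview and acquisition metadata on this page) of mapping codes to class labels and converting the [0, 7] s interval into sample counts at 500 Hz:

```python
SFREQ = 500.0          # sampling rate from the Acquisition section
TMIN, TMAX = 0.0, 7.0  # trial interval in seconds

# Event code -> class label, as listed in the overview
event_id = {"left_hand": 1, "right_hand": 2}
code_to_label = {code: label for label, code in event_id.items()}

# Samples per epoch for the [0, 7] s window
n_samples = int((TMAX - TMIN) * SFREQ)
```

With these values each epoch spans 3500 samples per channel; an epochs array for one recording would have shape (n_trials, 128, 3500).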
Acquisition
Sampling rate: 500.0 Hz
Number of channels: 128
Channel types: eeg=128
Channel names: 1–128 (sequential numeric labels)
Montage: standard_1020
Hardware: BrainAmp
Reference: Cz
Line frequency: 50.0 Hz
Online filters: highpass_time_constant_s = 10
Impedance threshold: 10 kOhm
Participants
Number of subjects: 10
Health status: healthy
Age: mean=25.6, std=2.5
Gender distribution: male=8, female=2
Handedness: right=8
BCI experience: mixed
Species: human
Experimental Protocol
Paradigm: imagery
Task type: motor_imagery
Number of classes: 2
Class labels: right_hand, left_hand
Trial duration: 10 s
Tasks: motor_imagery
Study design: two-class motor imagery with arrow cues
Feedback type: none
Stimulus type: arrow_cue
Stimulus modalities: visual
Primary modality: visual
Synchronicity: synchronous
Mode: offline
Instructions: Subjects were instructed to perform haptic motor imagery of the left or the right hand during display of the arrow, as indicated by the direction of the arrow.
HED Event Annotations
Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser
right_hand
├─ Sensory-event
│ ├─ Experimental-stimulus
│ ├─ Visual-presentation
│ └─ Rightward, Arrow
└─ Agent-action
└─ Imagine
├─ Move
└─ Right, Hand
left_hand
├─ Sensory-event
│ ├─ Experimental-stimulus
│ ├─ Visual-presentation
│ └─ Leftward, Arrow
└─ Agent-action
└─ Imagine
├─ Move
└─ Left, Hand
Paradigm-Specific Parameters
Detected paradigm: motor_imagery
Imagery tasks: left_hand, right_hand
Cue duration: 7.0 s
Imagery duration: 7.0 s
Data Structure
Trials: 150
Trials context: per_class
Preprocessing
Data state: preprocessed
Preprocessing applied: True
Artifact methods: none
Re-reference: car
Notes: No trials were rejected and no artifact correction was performed. Data were re-referenced to common average reference offline.
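Common average referencing (the offline re-reference noted above) subtracts the instantaneous mean over all channels from every channel. A minimal NumPy sketch, assuming the data as a (channels × samples) array; this illustrates the operation only, not the dataset's actual pipeline code:

```python
import numpy as np

def common_average_reference(data: np.ndarray) -> np.ndarray:
    """Re-reference EEG to the common average.

    data : array of shape (n_channels, n_samples)
    """
    return data - data.mean(axis=0, keepdims=True)

# Toy example: 4 channels, 5 samples of random data
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 5))
x_car = common_average_reference(x)
# After CAR, the mean across channels at every sample is (numerically) zero
```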
Signal Processing
Classifiers: Logistic Regression
Feature extraction: CSP, Beamforming, Laplacian, Bandpower
Frequency band analyzed: 7.0–30.0 Hz
Spatial filters: CSP, Beamforming, Laplacian
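Band-power features in the 7–30 Hz band analyzed here can be sketched as: band-pass the signal, then take the log-variance per channel. The NumPy-only version below uses a crude FFT mask as the band-pass; it is a simplification for illustration (the original study applied CSP/beamforming spatial filters before extracting power, which is not reproduced here):

```python
import numpy as np

def log_bandpower(epoch: np.ndarray, sfreq: float,
                  fmin: float = 7.0, fmax: float = 30.0) -> np.ndarray:
    """Log-variance of an FFT band-passed epoch.

    epoch : (n_channels, n_samples); returns one feature per channel.
    """
    n = epoch.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)
    spectrum = np.fft.rfft(epoch, axis=-1)
    spectrum[..., (freqs < fmin) | (freqs > fmax)] = 0.0  # zero out-of-band bins
    filtered = np.fft.irfft(spectrum, n=n, axis=-1)
    return np.log(np.var(filtered, axis=-1))

# Toy epoch: 2 channels, 7 s at 500 Hz
sfreq = 500.0
t = np.arange(int(7 * sfreq)) / sfreq
epoch = np.vstack([np.sin(2 * np.pi * 10 * t),   # 10 Hz: inside the band
                   np.sin(2 * np.pi * 50 * t)])  # 50 Hz: outside the band
features = log_bandpower(epoch, sfreq)
# The in-band channel keeps its power; the out-of-band channel is suppressed
```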
Cross-Validation
Method: bootstrapping
Evaluation type: within_subject
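Within-subject bootstrapping means repeatedly resampling a subject's trials with replacement for training and evaluating on the held-out (out-of-bag) trials. A minimal NumPy sketch of such a split generator; the exact resampling procedure of the original study may differ, so treat this as an illustration of the scheme only:

```python
import numpy as np

def bootstrap_splits(n_trials: int, n_iterations: int, seed: int = 0):
    """Yield (train, test) index arrays: train is drawn with replacement,
    test is the remaining out-of-bag trials."""
    rng = np.random.default_rng(seed)
    all_idx = np.arange(n_trials)
    for _ in range(n_iterations):
        train = rng.choice(all_idx, size=n_trials, replace=True)
        test = np.setdiff1d(all_idx, train)  # out-of-bag trials
        yield train, test

# 150 trials (as listed under Data Structure), 10 bootstrap iterations
splits = list(bootstrap_splits(n_trials=150, n_iterations=10))
```

On average roughly 37% of trials are out-of-bag in each iteration, so every split has a non-trivial test set.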
BCI Application
Applications: motor_control
Environment: shielded_room
Online feedback: False
Tags
Pathology: Healthy
Modality: Motor
Type: Motor
Documentation
DOI: 10.1109/TBME.2008.2009768
License: CC-BY-4.0
Investigators: Moritz Grosse-Wentrup, Christian Liefhold, Klaus Gramann, Martin Buss
Senior author: Martin Buss
Contact: moritzgw@ieee.org
Institution: Technische Universität München
Department: Institute of Automatic Control Engineering (LSR)
Country: DE
Repository: Zenodo
Publication year: 2009
Keywords: Beamforming, brain-computer interfaces, common spatial patterns, electroencephalography, motor imagery, spatial filtering
References
Grosse-Wentrup, Moritz, et al. "Beamforming in noninvasive brain–computer interfaces." IEEE Transactions on Biomedical Engineering 56.4 (2009): 1209–1219.
Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., and Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8
Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb
Dataset Information
Dataset ID: NM000145
Title: Munich Motor Imagery dataset
Author (year): GrosseWentrup2009
Canonical: —
Importable as: NM000145, GrosseWentrup2009
Year: 2009
Authors: Moritz Grosse-Wentrup, Christian Liefhold, Klaus Gramann, Martin Buss
License: CC-BY-4.0
Citation / DOI: Unknown
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details
Subjects: 10
Recordings: 10
Tasks: 1
Channels: 128
Sampling rate (Hz): 500.0
Duration (hours): ≈8.40
Pathology: Healthy
Modality: Visual
Type: Motor
Size on disk: 5.4 GB
File count: 10
Format: BIDS
License: CC-BY-4.0
DOI: —
API Reference
Use the NM000145 class to access this dataset programmatically.

class eegdash.dataset.NM000145(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)

Bases: EEGDashDataset

Munich Motor Imagery dataset.

- Study: nm000145 (NeMAR)
- Author (year): GrosseWentrup2009
- Canonical: —
- Also importable as: NM000145, GrosseWentrup2009
- Modality: eeg; Experiment type: Motor; Subject type: Healthy
- Subjects: 10; recordings: 10; tasks: 1

Parameters:

- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key "dataset".
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

Attributes:

- data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
- query (dict) – Merged query with the dataset filter applied.
- records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.

Notes

Each item is a recording; recording-level metadata are available via dataset.description. The query argument supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References

- OpenNeuro dataset: https://openneuro.org/datasets/nm000145
- NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000145

Examples

```python
>>> from eegdash.dataset import NM000145
>>> dataset = NM000145(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
```
See Also

- eegdash.dataset.EEGDashDataset
- eegdash.dataset