DS005342#
EEG data offline and online during motor imagery for standing and sitting
Access recordings and metadata through EEGDash.
Citation: Nayid Triana-Guzman, Alvaro D Orjuela-Cañon, Andres L Jutinico, Omar Mendoza-Montoya, Javier M Antelis (2024). EEG data offline and online during motor imagery for standing and sitting. 10.18112/openneuro.ds005342.v1.0.3
Modality: eeg Subjects: 32 Recordings: 134 License: CC0 Source: openneuro Citations: 1.0
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS005342
dataset = DS005342(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS005342(cache_dir="./data", subject="01")
Advanced query
dataset = DS005342(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
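The MongoDB-style filter used in the advanced query can also be assembled programmatically, e.g. when the subject list comes from a config file. A minimal sketch with plain dictionaries (the `task` field and its label are hypothetical, added only to illustrate combining two ANDed conditions):

```python
# Build a MongoDB-style query for a list of subjects.
subjects = ["01", "02", "05"]
query = {"subject": {"$in": subjects}}

# The same pattern extends to other queryable fields; both
# top-level keys below are ANDed together by the query engine.
query_with_task = {
    "subject": {"$in": subjects},
    "task": "motorimagery",  # hypothetical task label, for illustration
}

print(query)
```

The resulting dictionary is passed as the `query=` argument to `DS005342`, exactly as in the advanced-query example.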
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds005342,
  title  = {EEG data offline and online during motor imagery for standing and sitting},
  author = {Nayid Triana-Guzman and Alvaro D Orjuela-Cañon and Andres L Jutinico and Omar Mendoza-Montoya and Javier M Antelis},
  doi    = {10.18112/openneuro.ds005342.v1.0.3},
  url    = {https://doi.org/10.18112/openneuro.ds005342.v1.0.3},
}
About This Dataset#
The experiments were conducted in an acoustically isolated room where only the participant and the experimenter were present. Participants voluntarily signed an informed consent form in accordance with the experimental protocol approved by the ethics committee of the Universidad Antonio Nariño. Each participant was seated in a chair in a posture that was comfortable for them but did not interfere with data collection. A 40-inch TV screen was placed about 3 m in front of the participant; on this screen, a graphical user interface (GUI) displayed images that guided the participant through the experiment. Each experimental session was divided into two phases: an offline phase and an online phase.
The offline experiments consisted of recording the participants' EEG signals during motor imagery trials for standing and sitting, guided by the GUI presented on the TV screen. Six offline runs were conducted, with the participants standing in three runs and sitting in the other three. In each run, the participant repeated a block of 30 mental-task trials indicated by visual cues presented continuously on the screen in a pseudo-random sequence.
The first phase of the experimental session was conducted to construct the offline parts of the dataset: (A) Sit-to-stand and (B) Stand-to-sit. The participant's EEG data were collected from 90 sequences for part A (45 trials of MotorImageryA tasks and 45 trials of IdleStateA tasks) and 90 sequences for part B (45 trials of MotorImageryB tasks and 45 trials of IdleStateB tasks).
For each participant, the two machine learning models obtained in the offline phase were used to carry out the online experiment parts of the dataset: (C) Sit-to-stand and (D) Stand-to-sit. Each participant was instructed to select, in no particular order, 30 sequences for part C (15 trials of MotorImageryA tasks and 15 trials of IdleStateA tasks) and another 30 sequences for part D (15 trials of MotorImageryB tasks and 15 trials of IdleStateB tasks). Each trial was unique and was generated pseudo-randomly before the experiment.
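Putting the offline and online counts together gives the per-participant trial tally; a quick check in Python:

```python
# Offline phase: parts A and B, 90 sequences each
offline = {
    "MotorImageryA": 45, "IdleStateA": 45,  # part A (sit-to-stand)
    "MotorImageryB": 45, "IdleStateB": 45,  # part B (stand-to-sit)
}

# Online phase: parts C and D, 30 sequences each
online = {
    "MotorImageryA": 15, "IdleStateA": 15,  # part C (sit-to-stand)
    "MotorImageryB": 15, "IdleStateB": 15,  # part D (stand-to-sit)
}

total = sum(offline.values()) + sum(online.values())
print(total)  # 240 trials per participant
```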
The database consisted of 32 electroencephalographic files corresponding to the 32 participants. All recordings were collected on channels F3, Fz, F4, FC5, FC1, FC2, FC6, C3, Cz, C4, CP5, CP1, CP2, CP6, P3, Pz, and P4 according to the 10-20 EEG electrode placement standard, grounded to AFz channel and referenced to right mastoid (M2). Each data file contained the data stream in a 2D matrix where rows corresponded to channels and columns corresponded to time samples with a sampling frequency of 250Hz.
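The channels-by-samples layout described above can be illustrated with a synthetic NumPy array (the values here are random placeholders, not data from the dataset):

```python
import numpy as np

n_channels = 17     # F3 ... P4, as listed above
sfreq = 250.0       # sampling frequency in Hz
duration_s = 60     # one illustrative minute of signal

# rows = channels, columns = time samples
data = np.random.randn(n_channels, int(sfreq * duration_s))
print(data.shape)   # (17, 15000)

# A column index converts to seconds by dividing by the sampling rate
sample = 1250
t = sample / sfreq  # 5.0 s into the recording
```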
The following marker numbers encoded information about the execution of the experiment. Marker numbers 200, 201, 202, and 203 indicated the beginning and end of the four steps of the sequence in a trial (resting, fixation, action observation, and imagining). Marker numbers 1, 2, 3, and 4 indicated the figure activated on the screen for the participant to perform the corresponding task: 1. actively imagining the sit-to-stand movement (labeled as MotorImageryA), 2. sitting motionless without imagining the sit-to-stand movement (labeled as IdleStateA), 3. standing motionless while actively imagining the stand-to-sit movement (labeled as MotorImageryB), or 4. standing motionless without imagining the stand-to-sit movement (labeled as IdleStateB). Finally, marker numbers 101, 102, 103, and 104 indicated the task detected by the BCI in real time during the online experiment: 101. MotorImageryA, 102. IdleStateA, 103. MotorImageryB, or 104. IdleStateB.
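The marker scheme above can be captured as plain lookup tables. A sketch, assuming markers 200–203 map in order to the four named sequence steps (the description gives the step names but not an explicit per-marker assignment):

```python
# Sequence-step boundary markers (order assumed from the description)
STEP_MARKERS = {200: "resting", 201: "fixation",
                202: "action observation", 203: "imagining"}

# Cue markers: task the participant was asked to perform
CUE_LABELS = {1: "MotorImageryA", 2: "IdleStateA",
              3: "MotorImageryB", 4: "IdleStateB"}

# Online feedback markers: task detected by the BCI (cue code + 100)
BCI_LABELS = {code + 100: label for code, label in CUE_LABELS.items()}

print(BCI_LABELS[101])  # MotorImageryA
```

The `code + 100` relation between cue and feedback markers follows directly from the numbering in the description, which makes it easy to compare the cued task against the task the BCI detected online.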
Dataset Information#
Dataset ID | ds005342
Title | EEG data offline and online during motor imagery for standing and sitting
Year | 2024
Authors | Nayid Triana-Guzman, Alvaro D Orjuela-Cañon, Andres L Jutinico, Omar Mendoza-Montoya, Javier M Antelis
License | CC0
Citation / DOI | 10.18112/openneuro.ds005342.v1.0.3
Source links | OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 32
Recordings: 134
Tasks: 1
Channels: 17
Sampling rate (Hz): 250.0
Duration (hours): 0.0
Pathology: Not specified
Modality: —
Type: —
Size on disk: 2.0 GB
File count: 134
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds005342.v1.0.3
API Reference#
Use the DS005342 class to access this dataset programmatically.
- class eegdash.dataset.DS005342(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
OpenNeuro dataset ds005342. Modality: eeg; Experiment type: Motor; Subject type: Healthy. Subjects: 32; recordings: 32; tasks: 1.
- Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
Local dataset cache directory (cache_dir / dataset_id).
- Type:
Path
- query#
Merged query with the dataset filter applied.
- Type:
dict
- records#
Metadata records used to build the dataset, if pre-fetched.
- Type:
list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/ds005342 NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds005342
Examples
>>> from eegdash.dataset import DS005342
>>> dataset = DS005342(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset