DS006940#
Dataset: EEG-Controlled Exoskeleton for Walking and Standing - A Longitudinal Study of Healthy Individuals
Access recordings and metadata through EEGDash.
Citation: Shantanu Sarkar, Kevin Nathan, Jose L. Contreras-Vidal (2025). Dataset: EEG-Controlled Exoskeleton for Walking and Standing - A Longitudinal Study of Healthy Individuals. 10.18112/openneuro.ds006940.v1.0.0
Modality: eeg | Subjects: 7 | Recordings: 14094 | License: CC0 | Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS006940
dataset = DS006940(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS006940(cache_dir="./data", subject="01")
Advanced query
dataset = DS006940(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info["sfreq"])
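Subject and task filters can be combined in a single MongoDB-style query. A minimal sketch, with one caveat: the `task` values below assume the BIDS task labels match the README's naming convention (`walk6min`, `stop6min`), which should be verified against the downloaded data.

```python
# Hypothetical combined filter: select the two extended motor-imagery runs
# for subjects 01 and 02. The task labels are assumed, not confirmed.
query = {
    "subject": {"$in": ["01", "02"]},
    "task": {"$in": ["walk6min", "stop6min"]},
}

# dataset = DS006940(cache_dir="./data", query=query)  # then iterate as above
```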
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds006940,
  title = {EEG-Controlled Exoskeleton for Walking and Standing - A Longitudinal Study of Healthy Individuals},
  author = {Shantanu Sarkar and Kevin Nathan and Jose L. Contreras-Vidal},
  year = {2025},
  doi = {10.18112/openneuro.ds006940.v1.0.0},
  url = {https://doi.org/10.18112/openneuro.ds006940.v1.0.0},
}
About This Dataset#
EEG-Controlled Exoskeleton for Walking and Standing: A Longitudinal Motor Imagery Study in Healthy Adults
Dataset Overview
This dataset contains multimodal recordings from a brain–machine interface (BMI) training study involving seven healthy adult participants (ages 20–30, mean = 24.3, SD = 3.8). The study focused on open-loop and closed-loop control of a lower-limb exoskeleton (Rex Bionics) using EEG and inertial sensor data. Each participant completed nine sessions over several weeks, structured into training and trial phases.
Experimental Design
* Participants: 7 healthy adults (4 male, 3 female)
* Sessions: 9 per participant
* Training Phase: Motor imagery calibration
* Trial Phase: Closed-loop BMI control (walk/stop)
* Conditions: Walk / Stop (motor imagery)
Task Structure and Naming Convention
Each session includes multiple motor imagery tasks organized as follows:
Training: The training phase is used to calibrate the BMI decoder. Participants perform motor imagery tasks without feedback.
TrialXX: The trial phase consists of 12 closed-loop BMI trials per session, labeled trial01 to trial12. During these trials, participants use motor imagery to control the exoskeleton in real time.
Block 1: Trials 1–4
Block 2: Trials 5–8
Block 3: Trials 9–12
walk6min / stop6min: After completing the 12 trials, participants perform two extended motor imagery tasks:
walk6min – Imagining continuous walking for 6 minutes
stop6min – Imagining standing still for 6 minutes
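The trial numbering above maps cleanly onto blocks, which can be handy when grouping recordings for analysis. The helpers below are an illustrative sketch (these function names are hypothetical, not part of eegdash): one builds the zero-padded `trialXX` label and the other returns the block a trial belongs to.

```python
def trial_label(n: int) -> str:
    """Return the zero-padded task label for closed-loop trial n (1-12)."""
    if not 1 <= n <= 12:
        raise ValueError("trial number must be in 1..12")
    return f"trial{n:02d}"

def trial_block(n: int) -> int:
    """Return the block (1-3) containing trial n: 1-4 -> 1, 5-8 -> 2, 9-12 -> 3."""
    if not 1 <= n <= 12:
        raise ValueError("trial number must be in 1..12")
    return (n - 1) // 4 + 1
```

For example, `trial_label(7)` gives `"trial07"` and `trial_block(7)` gives `2`.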
Data Modalities
* EEG: 60 scalp channels + 4 EOG channels
* IMU: 3-axis accelerometer, gyroscope, magnetometer, and quaternion
* Sensor Placement: IMUs mounted on the participant's forehead and the exosuit back brace
* Decoder Signals/Feedback: Logged control signals and BMI predictions
Additional Materials
* MIQ-RS: Motor Imagery Questionnaire – Revised Second Version (PDFs in derivatives/MIQ-RS/)
* Validation Tables: Data availability, synchronization, and electrode placement (derivatives/validation/)
* Raw Data: Provided without filtering or artifact removal
BIDS Structure
* dataset_description.json: Metadata and provenance
* sub-XX/ses-YY/: EEG and IMU recordings per session
* derivatives/: MIQ-RS responses and validation spreadsheets
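Given the sub-XX/ses-YY pattern, per-session paths can be built with pathlib. A minimal sketch, assuming zero-padded two-digit subject and session labels (the pattern suggests this, but verify against the actual folder names after download):

```python
from pathlib import Path

def session_dir(root: str, subject: str, session: str) -> Path:
    """Build a BIDS session directory such as sub-01/ses-03 under root.

    Assumes zero-padded two-digit labels, as suggested by the sub-XX/ses-YY
    naming in the dataset description.
    """
    return Path(root) / f"sub-{subject}" / f"ses-{session}"

p = session_dir("./data/ds006940", "01", "03")
print(p.as_posix())  # data/ds006940/sub-01/ses-03
```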
Dataset Information#
Dataset ID | ds006940 |
Title | EEG-Controlled Exoskeleton for Walking and Standing - A Longitudinal Study of Healthy Individuals |
Year | 2025 |
Authors | Shantanu Sarkar, Kevin Nathan, Jose L. Contreras-Vidal |
License | CC0 |
Citation / DOI | 10.18112/openneuro.ds006940.v1.0.0 |
Source links | OpenNeuro | NeMAR | Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 7
Recordings: 14094
Tasks: 135
Channels: 60
Sampling rate (Hz): 100.0
Duration (hours): 0.0
Pathology: Healthy
Modality: Motor
Type: Motor
Size on disk: 3.6 GB
File count: 14094
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds006940.v1.0.0
API Reference#
Use the DS006940 class to access this dataset programmatically.
- class eegdash.dataset.DS006940(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
  Bases: EEGDashDataset
  OpenNeuro dataset ds006940. Modality: eeg; Experiment type: Motor; Subject type: Healthy. Subjects: 7; recordings: 935; tasks: 15.
  - Parameters:
    cache_dir (str | Path) – Directory where data are cached locally.
    query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
    s3_bucket (str | None) – Base S3 bucket used to locate the data.
    **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
  - data_dir#
    Local dataset cache directory (cache_dir / dataset_id).
    - Type: Path
  - query#
    Merged query with the dataset filter applied.
    - Type: dict
  - records#
    Metadata records used to build the dataset, if pre-fetched.
    - Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/ds006940
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds006940
Examples
>>> from eegdash.dataset import DS006940
>>> dataset = DS006940(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset