DS007353: EEG/MEG dataset, 32 subjects#
HAD-MEEG
Access recordings and metadata through EEGDash.
Citation: Guohao Zhang, Sai Ma, Ming Zhou, Shaohua Tang, Shuyi Zhen, Zheng Li, Zonglei Zhen (2026). HAD-MEEG. 10.18112/openneuro.ds007353.v1.0.0
Modality: eeg, meg · Subjects: 32 · Recordings: 473 · License: CC0 · Source: OpenNeuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS007353
dataset = DS007353(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS007353(cache_dir="./data", subject="01")
Advanced query
dataset = DS007353(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
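MongoDB-style filters are plain Python dicts, so they can be built and composed programmatically before being passed as `query`. The matcher below is a hypothetical stand-in written only to illustrate the `$in` semantics; EEGDash performs the actual filtering when it selects recordings:

```python
# Build a MongoDB-style filter programmatically.
subjects = [f"{i:02d}" for i in range(1, 5)]  # ["01", "02", "03", "04"]
query = {"subject": {"$in": subjects}}

def matches(record: dict, query: dict) -> bool:
    """Hypothetical local matcher illustrating $in and equality semantics."""
    for field, cond in query.items():
        value = record.get(field)
        if isinstance(cond, dict) and "$in" in cond:
            if value not in cond["$in"]:
                return False
        elif value != cond:
            return False
    return True

print(matches({"subject": "02"}, query))  # True
print(matches({"subject": "07"}, query))  # False
```

Because filters are ordinary dicts, they are easy to generate from subject lists, configuration files, or loops over experimental conditions.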
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
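When iterating, it is often useful to group recordings before processing, since this dataset mixes two sampling rates (1200 Hz and 1000 Hz per the technical details below). A minimal sketch of the tallying pattern using stand-in records; with a real dataset you would iterate `dataset` and read `rec.raw.info['sfreq']` instead:

```python
from collections import Counter

# Stand-in records for illustration; a real loop would read
# rec.raw.info['sfreq'] from each EEGDash recording.
records = [
    {"subject": "01", "sfreq": 1200.0},
    {"subject": "01", "sfreq": 1000.0},
    {"subject": "02", "sfreq": 1200.0},
]

# Tally how many recordings exist at each sampling rate.
counts = Counter(rec["sfreq"] for rec in records)
print(counts)  # Counter({1200.0: 2, 1000.0: 1})
```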
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds007353,
  title  = {HAD-MEEG},
  author = {Guohao Zhang and Sai Ma and Ming Zhou and Shaohua Tang and Shuyi Zhen and Zheng Li and Zonglei Zhen},
  doi    = {10.18112/openneuro.ds007353.v1.0.0},
  url    = {https://doi.org/10.18112/openneuro.ds007353.v1.0.0},
}
About This Dataset#
Human action recognition is a core component of social cognition, engaging spatially distributed and temporally evolving neural responses that encode visual information and infer intention. To map the brain’s spatial organization supporting this process, we previously released the Human Action Dataset (HAD), a functional magnetic resonance imaging (fMRI) resource. However, fMRI’s limited temporal resolution constrains its ability to capture rapid neural dynamics. Here, we present the HAD-MEEG dataset, which extends HAD-fMRI by leveraging the millisecond-level temporal resolution of magnetoencephalography (MEG) and electroencephalography (EEG). HAD-MEEG was recorded in the same participants and with the same stimuli as HAD-fMRI, in which 30 participants viewed 21,600 video clips spanning 180 categories of human action. By integrating the temporal precision of M/EEG with the spatial precision of fMRI, HAD enables comprehensive spatiotemporal investigation of the neural mechanisms underlying human action recognition.
Dataset Information#
| Field | Value |
|---|---|
| Dataset ID | ds007353 |
| Title | HAD-MEEG |
| Author (year) | Zhang2026 |
| Canonical | HAD_MEEG, HADMEEG |
| Importable as | DS007353, Zhang2026, HAD_MEEG, HADMEEG |
| Year | 2026 |
| Authors | Guohao Zhang, Sai Ma, Ming Zhou, Shaohua Tang, Shuyi Zhen, Zheng Li, Zonglei Zhen |
| License | CC0 |
| Citation / DOI | doi:10.18112/openneuro.ds007353.v1.0.0 |
| Source links | OpenNeuro · NeMAR · Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 32
Recordings: 473
Tasks: 2
Channels: 409 (240 recordings), 64 (224 recordings), 378 (9 recordings)
Sampling rate (Hz): 1200.0 (249 recordings), 1000.0 (224 recordings)
Duration (hours): 44.83
Pathology: Healthy
Modality: Visual
Type: Perception
Size on disk: 180.6 GB
File count: 473
Format: BIDS
License: CC0
DOI: 10.18112/openneuro.ds007353.v1.0.0
API Reference#
Use the DS007353 class to access this dataset programmatically.
class eegdash.dataset.DS007353(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)#

Bases: EEGDashDataset

HAD-MEEG

- Study: ds007353 (OpenNeuro)
- Author (year): Zhang2026
- Canonical: HAD_MEEG, HADMEEG
- Also importable as: DS007353, Zhang2026, HAD_MEEG, HADMEEG
- Modality: eeg, meg; Experiment type: Perception; Subject type: Healthy
- Subjects: 32; recordings: 473; tasks: 2

Parameters:
- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
Attributes:
- data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
- query (dict) – Merged query with the dataset filter applied.
- records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References
OpenNeuro dataset: https://openneuro.org/datasets/ds007353
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds007353
DOI: https://doi.org/10.18112/openneuro.ds007353.v1.0.0
Examples
>>> from eegdash.dataset import DS007353
>>> dataset = DS007353(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset