DS005811#
NOD-EEG
Access recordings and metadata through EEGDash.
Citation: Guohao Zhang, Ming Zhou, Shaohua Tang, Shuyi Zhen, Zheng Li, Zonglei Zhen (2025). NOD-EEG. 10.18112/openneuro.ds005811.v1.0.8
Modality: eeg | Subjects: 19 | Recordings: 2413 | License: CC0 | Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS005811
dataset = DS005811(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS005811(cache_dir="./data", subject="01")
Advanced query
dataset = DS005811(
cache_dir="./data",
query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds005811,
title = {NOD-EEG},
author = {Guohao Zhang and Ming Zhou and Shaohua Tang and Shuyi Zhen and Zheng Li and Zonglei Zhen},
doi = {10.18112/openneuro.ds005811.v1.0.8},
url = {https://doi.org/10.18112/openneuro.ds005811.v1.0.8},
}
About This Dataset#
Summary
The human brain can rapidly recognize meaningful objects in the natural scenes encountered in everyday life. Neuroimaging with large-scale naturalistic stimuli is increasingly employed to elucidate the neural mechanisms of object recognition in these rich, everyday scenes. However, most existing large-scale neuroimaging datasets with naturalistic stimuli rely primarily on functional magnetic resonance imaging (fMRI), which provides high spatial resolution for characterizing spatial representation patterns but is limited in capturing the temporal dynamics inherent in visual cognitive processing.
To address this limitation, we extended our previously collected Natural Object Dataset-fMRI (NOD-fMRI) by collecting both magnetoencephalography (MEG) and electroencephalography (EEG) data from the same subjects while viewing the same set of naturalistic stimuli. As a result, the NOD uniquely integrates three different modalities—fMRI, MEG, and EEG—thus offering promising avenues to examine brain activity induced by naturalistic stimuli with both high spatial and high temporal resolutions. Additionally, the NOD encompasses a diverse array of naturalistic stimuli and a broader subject pool, enabling researchers to explore differences in neural activation patterns across both stimuli and subjects.
We anticipate that the NOD dataset will serve as a valuable resource for advancing our understanding of the cognitive and neural mechanisms underlying object recognition.
The MEG data’s accession number is ds005810.
Data Records
Directory Structure
The raw data from each subject are stored in the sub-subID directory, while preprocessed data and epoch data are stored in the following directories:
- Preprocessed Data: derivatives/preprocessed/raw
- Epoch Data: derivatives/preprocessed/epochs
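Given this layout, the expected file locations for a single recording can be composed mechanically. The helper below is a hypothetical sketch (not part of eegdash); the file names and the task-ImageNet label follow the naming conventions described later on this page, so adjust if your local copy differs.

```python
from pathlib import Path


def derivative_paths(root: str, subject: str, session: str, run: str) -> dict:
    """Build the expected derivative file paths for one recording.

    Hypothetical helper following the directory layout described above:
    preprocessed runs under derivatives/preprocessed/raw, concatenated
    epochs under derivatives/preprocessed/epochs.
    """
    base = Path(root)
    return {
        "clean_raw": base / "derivatives" / "preprocessed" / "raw"
        / f"sub-{subject}_ses-{session}_task-ImageNet_run-{run}_eeg_clean.fif",
        "epochs": base / "derivatives" / "preprocessed" / "epochs"
        / f"sub-{subject}_epo.fif",
    }


paths = derivative_paths("./data/ds005811", "01", "01", "01")
print(paths["epochs"].name)  # -> sub-01_epo.fif
```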
Stimulus Images
The stimulus images used for MEG and EEG are identical and are stored in the stimuli/ImageNet directory. Images in this folder are named using the format synsetID_imageID.JPEG, where:
- synsetID is the ILSVRC category information.
- imageID is the unique number for the image within that category.
The image metadata, including category information, is available in the table files under the stimuli/metadata directory.
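Because the filenames encode the category, the synset and image identifiers can be recovered with plain string parsing. A minimal sketch (the helper and the example filename are illustrative, not part of the dataset tooling):

```python
def parse_stimulus_name(filename: str) -> tuple[str, str]:
    """Split a stimulus filename of the form synsetID_imageID.JPEG
    into its (synsetID, imageID) parts."""
    stem = filename.rsplit(".", 1)[0]          # drop the .JPEG extension
    synset_id, image_id = stem.split("_", 1)   # split at the first underscore
    return synset_id, image_id

print(parse_stimulus_name("n01440764_10026.JPEG"))  # -> ('n01440764', '10026')
```

The recovered synsetID can then be matched against the category tables under stimuli/metadata.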
Raw Data
Raw EEG data are stored in BIDS format. Each subject’s directory contains multiple session folders, designated ses-sesID. Comprehensive trial information for each subject is documented in derivatives/detailed_events/sub-subID_events.csv, where each row corresponds to a trial and each column contains metadata for that trial, including the session and run number, the stimulus category, and the subject’s response.
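A common way to explore that trial table is with pandas. The snippet below runs on a tiny in-memory sample; the column names (session, run, category, response) are assumptions inferred from the description above, so check the real file’s header before relying on them.

```python
import io

import pandas as pd

# Tiny illustrative sample; the real table lives at
# derivatives/detailed_events/sub-subID_events.csv. Column names here
# (session, run, category, response) are assumed, not taken from the file.
sample = io.StringIO(
    "session,run,category,response\n"
    "1,1,n01440764,1\n"
    "1,2,n01443537,0\n"
    "2,1,n01440764,1\n"
)
events = pd.read_csv(sample)
trials_per_session = events.groupby("session").size()
print(trials_per_session.to_dict())  # -> {1: 2, 2: 1}
```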
Preprocessed Data
The full time series of the preprocessed data are archived in the derivatives/raw directory, with files named sub-subID_ses-sesID_task-ImageNet_run-runID_eeg_clean.fif. The epoch data derived from the preprocessed data are stored in the derivatives/epochs directory, where all data for each subject are concatenated into a single file labeled sub-subID_epo.fif.
The trial information within each subject’s epochs data can be accessed via the metadata of the epochs data, which are aligned with the content of the subject’s sub-subID_events.csv file.
Dataset Information#
| Field | Value |
| --- | --- |
| Dataset ID | ds005811 |
| Title | NOD-EEG |
| Year | 2025 |
| Authors | Guohao Zhang, Ming Zhou, Shaohua Tang, Shuyi Zhen, Zheng Li, Zonglei Zhen |
| License | CC0 |
| Citation / DOI | doi:10.18112/openneuro.ds005811.v1.0.8 |
| Source links | OpenNeuro, NeMAR, Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 19
Recordings: 2413
Tasks: 1
Channels: 62 (448 recordings), 64 (440), 66 (8)
Sampling rate (Hz): 500.0 (576 recordings), 1000.0 (320)
Duration (hours): 0.0
Pathology: Healthy
Modality: Visual
Type: Perception
Size on disk: 14.1 GB
File count: 2413
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds005811.v1.0.8
API Reference#
Use the DS005811 class to access this dataset programmatically.
- class eegdash.dataset.DS005811(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

  Bases: EEGDashDataset

  OpenNeuro dataset ds005811. Modality: eeg; Experiment type: Perception; Subject type: Healthy. Subjects: 19; recordings: 448; tasks: 1.

  - Parameters:
    - cache_dir (str | Path) – Directory where data are cached locally.
    - query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
    - s3_bucket (str | None) – Base S3 bucket used to locate the data.
    - **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

  - data_dir#

    Local dataset cache directory (cache_dir / dataset_id).
    - Type: Path

  - query#

    Merged query with the dataset filter applied.
    - Type: dict

  - records#

    Metadata records used to build the dataset, if pre-fetched.
    - Type: list[dict] | None

  Notes

  Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on the fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

  References

  OpenNeuro dataset: https://openneuro.org/datasets/ds005811
  NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds005811

  Examples

  >>> from eegdash.dataset import DS005811
  >>> dataset = DS005811(cache_dir="./data")
  >>> recording = dataset[0]
  >>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset