DS005810#

NOD-MEG

Access recordings and metadata through EEGDash.

Citation: Guohao Zhang, Ming Zhou, Shaohua Tang, Shuyi Zhen, Zheng Li, Zonglei Zhen (2025). NOD-MEG. 10.18112/openneuro.ds005810.v1.0.6

Modality: meg Subjects: 30 Recordings: 1579 License: CC0 Source: openneuro

Metadata: Complete (100%)

Quickstart#

Install

pip install eegdash

Access the data

from eegdash.dataset import DS005810

dataset = DS005810(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)

Filter by subject

dataset = DS005810(cache_dir="./data", subject="01")

Advanced query

dataset = DS005810(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)

Iterate recordings

for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])

If you use this dataset in your research, please cite the original authors.

BibTeX

@dataset{ds005810,
  title = {NOD-MEG},
  author = {Guohao Zhang and Ming Zhou and Shaohua Tang and Shuyi Zhen and Zheng Li and Zonglei Zhen},
  doi = {10.18112/openneuro.ds005810.v1.0.6},
  url = {https://doi.org/10.18112/openneuro.ds005810.v1.0.6},
}

About This Dataset#

Summary

The human brain can rapidly recognize meaningful objects in the natural scenes encountered in everyday life. Neuroimaging with large-scale naturalistic stimuli is increasingly employed to elucidate the neural mechanisms of object recognition in such rich, everyday scenes. However, most existing large-scale neuroimaging datasets with naturalistic stimuli rely primarily on functional magnetic resonance imaging (fMRI), which provides the high spatial resolution needed to characterize spatial representation patterns but is limited in capturing the temporal dynamics inherent in visual cognitive processing.

To address this limitation, we extended our previously collected Natural Object Dataset-fMRI (NOD-fMRI) by collecting both magnetoencephalography (MEG) and electroencephalography (EEG) data from the same subjects while viewing the same set of naturalistic stimuli. As a result, the NOD uniquely integrates three different modalities—fMRI, MEG, and EEG—thus offering promising avenues to examine brain activity induced by naturalistic stimuli with both high spatial and high temporal resolutions. Additionally, the NOD encompasses a diverse array of naturalistic stimuli and a broader subject pool, enabling researchers to explore differences in neural activation patterns across both stimuli and subjects.

We anticipate that the NOD dataset will serve as a valuable resource for advancing our understanding of the cognitive and neural mechanisms underlying object recognition.


The EEG data’s accession number is ds005811.

Data Records

Directory Structure

The raw data from each subject are stored in the sub-subID directory, while preprocessed data and epoch data are stored in the following directories:
  • Preprocessed data: derivatives/preprocessed/raw

  • Epoch data: derivatives/preprocessed/epochs

Stimulus Images

The stimulus images used for MEG and EEG are identical and are stored in the stimuli/ImageNet directory. Images within this folder are named in the format synsetID_imageID.JPEG, where:
  • synsetID is the ILSVRC category identifier.

  • imageID is the unique number of the image within that category.

The image metadata, including category information, is available in the table files under the stimuli/metadata directory.
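As a small illustration of the naming scheme above, the following sketch splits a stimulus filename into its synset and image components. The example filename is hypothetical; the helper is not part of the dataset or the eegdash API.

```python
def parse_stimulus_name(filename: str) -> tuple[str, str]:
    """Split a 'synsetID_imageID.JPEG' filename into (synsetID, imageID)."""
    stem = filename.rsplit(".", 1)[0]          # drop the .JPEG extension
    synset_id, image_id = stem.split("_", 1)   # split on the first underscore
    return synset_id, image_id

# Hypothetical example filename following the documented scheme
synset_id, image_id = parse_stimulus_name("n01440764_10026.JPEG")
print(synset_id, image_id)  # n01440764 10026
```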

Raw Data

Raw MEG data are stored in BIDS format. Each subject’s directory contains multiple session folders, designated ses-sesID. Comprehensive trial information for each subject is documented in the file derivatives/detailed_events/sub-subID_events.csv, where each row corresponds to a trial and each column contains metadata for that trial, including the session and run number, the category information of the stimulus, and the subject’s response.
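A minimal sketch of how the per-subject events path could be assembled from the layout described above; `detailed_events_path` is an illustrative helper, not part of the eegdash API, and the cache root is an assumed example.

```python
from pathlib import Path

def detailed_events_path(root: str, subject: str) -> Path:
    """Build the path to a subject's trial-level events table,
    following the derivatives layout described above."""
    return Path(root) / "derivatives" / "detailed_events" / f"sub-{subject}_events.csv"

path = detailed_events_path("./data/ds005810", "01")
print(path.as_posix())
# Once the file is cached locally, it could be read with, e.g.,
# pandas.read_csv(path); each row is one trial.
```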

Preprocessed Data

The full time series of the preprocessed data are archived in the derivatives/raw directory, named sub-subID_ses-sesID_task-ImageNet_run-runID_meg_clean.fif. The epoch data derived from the preprocessed data are stored in the derivatives/epochs directory, where all data for each subject are concatenated into a single file named sub-subID_epo.fif.

The trial information for each subject’s epochs can be accessed via the metadata attached to the epochs data, which is aligned with the content of the subject’s sub-subID_events.csv file.
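The epochs filename convention above can be sketched as a small path builder. The helper and the cache root are illustrative assumptions, not part of the eegdash API; the commented MNE calls show how the file could then be loaded.

```python
def epochs_path(root: str, subject: str) -> str:
    """Path to a subject's concatenated epochs file (sub-subID_epo.fif),
    following the derivatives/epochs layout described above."""
    return f"{root}/derivatives/epochs/sub-{subject}_epo.fif"

path = epochs_path("./data/ds005810", "01")
print(path)
# With the file cached locally, the epochs and their trial metadata
# could be loaded with MNE:
#   import mne
#   epochs = mne.read_epochs(path)
#   print(epochs.metadata)  # rows aligned with sub-subID_events.csv
```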

Dataset Information#

Dataset ID

DS005810

Title

NOD-MEG

Year

2025

Authors

Guohao Zhang, Ming Zhou, Shaohua Tang, Shuyi Zhen, Zheng Li, Zonglei Zhen

License

CC0

Citation / DOI

doi:10.18112/openneuro.ds005810.v1.0.6

Source links

OpenNeuro | NeMAR | Source URL


Found an issue with this dataset?

If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!

Report an Issue on GitHub

Technical Details#

Subjects & recordings
  • Subjects: 30

  • Recordings: 1579

  • Tasks: 2

Channels & sampling rate
  • Channel counts (recordings per count): 273 (290), 409 (262), 378 (20)

  • Sampling rate (Hz): 1200.0

  • Duration (hours): 0.0

Tags
  • Pathology: Healthy

  • Modality: Visual

  • Type: Perception

Files & format
  • Size on disk: 149.5 GB

  • File count: 1579

  • Format: BIDS

License & citation
  • License: CC0

  • DOI: doi:10.18112/openneuro.ds005810.v1.0.6

API Reference#

Use the DS005810 class to access this dataset programmatically.

class eegdash.dataset.DS005810(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

OpenNeuro dataset ds005810. Modality: meg; Experiment type: Perception; Subject type: Healthy. Subjects: 30; recordings: 1579; tasks: 2.

Parameters:
  • cache_dir (str | Path) – Directory where data are cached locally.

  • query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.

  • s3_bucket (str | None) – Base S3 bucket used to locate the data.

  • **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

data_dir#

Local dataset cache directory (cache_dir / dataset_id).

Type:

Path

query#

Merged query with the dataset filter applied.

Type:

dict

records#

Metadata records used to build the dataset, if pre-fetched.

Type:

list[dict] | None

Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References

OpenNeuro dataset: https://openneuro.org/datasets/ds005810

NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds005810

Examples

>>> from eegdash.dataset import DS005810
>>> dataset = DS005810(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
__init__(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
save(path, overwrite=False)[source]#

Save the dataset to disk.

Parameters:
  • path (str or Path) – Destination file path.

  • overwrite (bool, default False) – If True, overwrite existing file.

Return type:

None

See Also#