DS004212#
THINGS-MEG
Access recordings and metadata through EEGDash.
Citation: Martin N. Hebart, Oliver Contier, Lina Teichmann, Adam H. Rockter, Charles Zheng, Alexis Kidder, Anna Corriveau, Maryam Vaziri-Pashkam, Chris I. Baker (2022). THINGS-MEG. 10.18112/openneuro.ds004212.v3.0.0
Modality: meg | Subjects: 4 | Recordings: 2896 | License: CC0 | Source: openneuro | Citations: 3.0
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS004212
dataset = DS004212(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS004212(cache_dir="./data", subject="01")
Advanced query
dataset = DS004212(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
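A `$in` filter is just a nested Python dict. The sketch below illustrates how such a filter selects matching records; the `match_in` helper and the sample records are local illustrations, not part of the EEGDash API:

```python
# Minimal illustration of MongoDB-style "$in" matching (local sketch,
# not EEGDash internals).

def match_in(record: dict, query: dict) -> bool:
    """Return True if every queried field satisfies its {"$in": [...]} clause."""
    for field, clause in query.items():
        if record.get(field) not in clause["$in"]:
            return False
    return True

query = {"subject": {"$in": ["01", "02"]}}
records = [{"subject": "01"}, {"subject": "03"}, {"subject": "02"}]
selected = [r for r in records if match_in(r, query)]
print([r["subject"] for r in selected])  # → ['01', '02']
```

The same dict shape passed as `query=` above is what the server-side filter evaluates against each recording's metadata.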
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info["sfreq"])
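Building on the loop above, recordings can be grouped by subject with standard-library tools. The stand-in records below are synthetic so the snippet runs without downloading anything; a real dataset yields objects exposing `.subject` as shown:

```python
from collections import defaultdict
from types import SimpleNamespace

# Synthetic stand-ins for recordings; a real EEGDash dataset yields
# objects with a .subject attribute, as in the loop above.
dataset = [SimpleNamespace(subject=s) for s in ["01", "01", "02", "02", "02"]]

by_subject = defaultdict(list)
for rec in dataset:
    by_subject[rec.subject].append(rec)

print({s: len(recs) for s, recs in by_subject.items()})  # → {'01': 2, '02': 3}
```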
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds004212,
  title  = {THINGS-MEG},
  author = {Martin N. Hebart and Oliver Contier and Lina Teichmann and Adam H. Rockter and Charles Zheng and Alexis Kidder and Anna Corriveau and Maryam Vaziri-Pashkam and Chris I. Baker},
  doi    = {10.18112/openneuro.ds004212.v3.0.0},
  url    = {https://doi.org/10.18112/openneuro.ds004212.v3.0.0},
}
About This Dataset#
THINGS-MEG
Understanding the visual and semantic processing of objects requires a broad, comprehensive sampling of the objects in our visual world with dense measurements of brain activity and behavior. This densely sampled MEG dataset is part of THINGS-data, a multimodal collection of large-scale datasets comprising functional MRI, magnetoencephalographic recordings, and 4.70 million behavioral judgments in response to thousands of photographic images for up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing for testing countless novel hypotheses at scale while assessing the reproducibility of previous findings. The multimodal data allow for studying both the temporal and spatial dynamics of object representations and their relationship to behavior, and additionally provide the means for combining these datasets for novel insights into object processing. THINGS-data constitutes the core release of the THINGS initiative for bridging the gap between disciplines and the advancement of cognitive neuroscience.
Dataset overview
We collected extensively sampled object representations using magnetoencephalography (MEG). To this end, we drew on the THINGS database (Hebart et al., 2019), a richly annotated database of 1,854 object concepts representative of the American English language, which contains 26,107 manually curated naturalistic object images.
During the MEG experiment, participants were shown a representative subset of THINGS images, spread across 12 separate sessions (N=4, 22,448 unique images of 1,854 objects). Images were shown in fast succession (1.5±0.2 s), and participants were instructed to maintain central fixation. To ensure engagement, participants performed an oddball detection task, responding to occasional artificially generated images. A subset of images (n=200) was shown repeatedly in each session.
Beyond the core functional imaging data in response to THINGS images, we acquired T1-weighted MRI scans to allow for cortical source localization. Eye movements were monitored in the MEG to ensure participants maintained central fixation.
Dataset Information#
Dataset ID: ds004212
Title: THINGS-MEG
Year: 2022
Authors: Martin N. Hebart, Oliver Contier, Lina Teichmann, Adam H. Rockter, Charles Zheng, Alexis Kidder, Anna Corriveau, Maryam Vaziri-Pashkam, Chris I. Baker
License: CC0
Citation / DOI: doi:10.18112/openneuro.ds004212.v3.0.0
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 4
Recordings: 2896
Tasks: 1
Channels: 310 (472), 272 (468)
Sampling rate (Hz): 1200.0
Duration (hours): 0.0
Pathology: Not specified
Modality: —
Type: —
Size on disk: 237.7 GB
File count: 2896
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds004212.v3.0.0
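As a quick sanity check on the size figures above, dividing the total size on disk by the file count gives the average per-file size (pure arithmetic, no download required):

```python
total_gb = 237.7   # size on disk, from the summary above
file_count = 2896  # number of files, from the summary above

# Using a 1024 GB→MB conversion factor.
avg_mb = total_gb * 1024 / file_count
print(f"average file size ≈ {avg_mb:.1f} MB")  # → average file size ≈ 84.0 MB
```

At roughly 84 MB per file, fetching only the subjects or recordings you need (via the `subject=` or `query=` filters shown in the Quickstart) is considerably cheaper than mirroring the full 237.7 GB.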
API Reference#
Use the DS004212 class to access this dataset programmatically.
class eegdash.dataset.DS004212(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
OpenNeuro dataset ds004212. Modality: meg; Experiment type: Unknown; Subject type: Unknown. Subjects: 4; recordings: 2896; tasks: 1.
Parameters:
- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters AND-ed with the dataset selection. Must not contain the key dataset.
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
Attributes:
- data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
- query (dict) – Merged query with the dataset filter applied.
- records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
OpenNeuro dataset: https://openneuro.org/datasets/ds004212
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds004212
Examples
>>> from eegdash.dataset import DS004212
>>> dataset = DS004212(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset