DS003922#
Multisensory Correlation Detector
Access recordings and metadata through EEGDash.
Citation: Pesnot Lerousseau, J., Parise, C., Ernst, MO., van Wassenhove, V. (2021). Multisensory Correlation Detector. 10.18112/openneuro.ds003922.v1.0.1
Modality: MEG · Subjects: 13 · Recordings: 674 · License: CC0 · Source: OpenNeuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS003922
dataset = DS003922(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS003922(cache_dir="./data", subject="01")
Advanced query
dataset = DS003922(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
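Inspect metadata
Recording-level metadata can also be browsed as a table. A minimal sketch, assuming the braindecode-style description DataFrame mentioned in the API reference below:
# One row per recording, with the recording-level metadata fields
print(dataset.description.head())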
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds003922,
  title = {Multisensory Correlation Detector},
  author = {Pesnot Lerousseau, J. and Parise, C. and Ernst, MO. and van Wassenhove, V.},
  doi = {10.18112/openneuro.ds003922.v1.0.1},
  url = {https://doi.org/10.18112/openneuro.ds003922.v1.0.1},
}
About This Dataset#
DESCRIPTION
Magnetoencephalography (MEG) dataset recorded during the presentation of audiovisual sequences with a causality judgment task and temporal order judgment task. This MEG dataset was prepared in the Brain Imaging Data Structure (MEG-BIDS, Niso et al. 2018) format using MNE-BIDS (Appelhoff et al. 2019).
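Because the data are organized as MEG-BIDS, a downloaded copy can also be opened directly with MNE-BIDS. A minimal sketch; the root path and the subject/task entities below are placeholders, not values verified against this dataset:
from mne_bids import BIDSPath, read_raw_bids

# Placeholder entities; substitute a real subject label and task name
bids_path = BIDSPath(root="./data/ds003922", subject="01",
                     task="causality", datatype="meg")
raw = read_raw_bids(bids_path=bids_path)
print(raw.info)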
PUBLISHED IN
Pesnot Lerousseau, J., Parise, C., Ernst, MO., van Wassenhove, V. (2022). Multisensory correlation computations in the human brain identified by a time-resolved encoding model. *Nature Communications*. http://doi.org/10.1038/s41467-022-29687-6
PARTICIPANTS
The dataset contains 13 participants (Ab140232, Jl150443, Mm150194, Al150424, Mp110340, Rt160359, Cb140229, Cc160310, Lb160367, Mb160304, Mk150295, Sl160372, Mp150285).
EXPERIMENT
The experiment consisted of 10 consecutive recording blocks of 8 minutes each, whose order was counterbalanced across participants. Three blocks tested participants on a Causality judgment and three blocks on a Temporal Order judgment. Importantly, the same audiovisual sequences were used in both tasks in order to maintain a constant flow of feedforward multisensory inputs while manipulating the endogenous task requirements. Each block was composed of 25 repetitions of the 6 possible audiovisual sequences, so a total of 75 presentations of each stimulus sequence were tested in each task. Four additional recording blocks consisted of participants passively hearing (auditory localizer, 2 blocks) or viewing (visual localizer, 2 blocks) one constitutive modality of the audiovisual sequence. Each localizer block was composed of 25 repetitions of the 6 possible stimuli (the auditory or visual part of each sequence), yielding a total of 50 presentations of each auditory and visual stimulus (2 tasks × 3 blocks × 25 repetitions × 6 sequences + 2 modalities × 2 blocks × 25 repetitions × 6 sequences = 1500 trials in total).
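As a quick sanity check, the stated totals can be reproduced with a few lines of arithmetic:
# Trial counts as stated in the EXPERIMENT description
task_trials = 2 * 3 * 25 * 6       # 2 tasks x 3 blocks x 25 repetitions x 6 sequences
localizer_trials = 2 * 2 * 25 * 6  # 2 modalities x 2 blocks x 25 repetitions x 6 stimuli
assert task_trials + localizer_trials == 1500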
STIMULI
Six audiovisual sequences were presented (DD, DC, CC, AA, AV, VV).
BLOCKS
Ten blocks were presented (3 Causality, 3 Temporal, 2 Auditory, 2 Visual).
EVENTS
'Causality/DD': 11
'Causality/DC': 12
'Causality/CC': 13
'Causality/AA': 14
'Causality/AV': 15
'Causality/VV': 16
'Temporal/DD': 21
'Temporal/DC': 22
'Temporal/CC': 23
'Temporal/AA': 24
'Temporal/AV': 25
'Temporal/VV': 26
'Auditory/DD': 41
'Auditory/DC': 42
'Auditory/CC': 43
'Auditory/AA': 44
'Auditory/AV': 45
'Auditory/VV': 46
'Visual/DD': 51
'Visual/DC': 52
'Visual/CC': 53
'Visual/AA': 54
'Visual/AV': 55
'Visual/VV': 56
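These codes map directly onto an MNE event dictionary. A minimal sketch, assuming the triggers are delivered on a standard stimulus channel (use mne.events_from_annotations instead if the BIDS events are stored as annotations); raw is the object loaded in the Quickstart:
import mne

# Rebuild the event mapping documented above
event_id = {
    f"{task}/{seq}": base + i
    for task, base in [("Causality", 10), ("Temporal", 20),
                       ("Auditory", 40), ("Visual", 50)]
    for i, seq in enumerate(["DD", "DC", "CC", "AA", "AV", "VV"], start=1)
}

events = mne.find_events(raw, shortest_event=1)
# A single recording contains only one block type, so ignore absent codes
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=1.0, on_missing="ignore")
print(epochs)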
MEG
Brain magnetic fields were recorded in a magnetically shielded room (MSR) using a 306-channel MEG system (Neuromag Elekta Ltd, Helsinki). MEG recordings were sampled at 1 kHz and band-pass filtered between 0.03 Hz and 330 Hz.
Four head position indicator (HPI) coils measured the head position of participants before each block; three fiducial markers (nasion and pre-auricular points) were used for digitization and for the anatomical MRI (aMRI) acquired immediately after the MEG session.
Electrooculograms (EOG, horizontal and vertical eye movements) and an electrocardiogram (ECG) were recorded simultaneously. Prior to the session, 2 min of empty-room recording was acquired for the computation of the noise covariance matrix.
Bad MEG channels were marked manually.
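The empty-room segment is what feeds the noise covariance. A minimal sketch with MNE-Python, assuming the empty-room file has been located in the download (the filename below is a placeholder):
import mne

# Placeholder path; locate the actual empty-room file in the BIDS tree
empty_room = mne.io.read_raw_fif("sub-emptyroom_task-noise_meg.fif", preload=True)
noise_cov = mne.compute_raw_covariance(empty_room, method="shrunk")
print(noise_cov)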
MRI
The T1-weighted aMRI was recorded using a 3-T Siemens Trio MRI scanner. Sequence parameters: voxel size 1.0 × 1.0 × 1.1 mm; acquisition time 466 s; repetition time TR = 2300 ms; echo time TE = 2.98 ms.
BEHAVIOR
Behavioral data are provided in sourcedata/behavioral_data.txt.
REFERENCES
Pesnot Lerousseau, J., Parise, C., Ernst, MO., van Wassenhove, V. (2022). Multisensory correlation computations in the human brain identified by a time-resolved encoding model. Nature Communications. http://doi.org/10.1038/s41467-022-29687-6
Appelhoff, S., Sanderson, M., Brooks, T., van Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Höchenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(44), 1896. https://doi.org/10.21105/joss.01896
Niso, G., Gorgolewski, K. J., Bock, E., Brooks, T. L., Flandin, G., Gramfort, A., Henson, R. N., Jas, M., Litvak, V., Moreau, J., Oostenveld, R., Schoffelen, J., Tadel, F., Wexler, J., Baillet, S. (2018). MEG-BIDS, the brain imaging data structure extended to magnetoencephalography. Scientific Data, 5, 180110. http://doi.org/10.1038/sdata.2018.110
Dataset Information#
Dataset ID: ds003922
Title: Multisensory Correlation Detector
Year: 2021
Authors: Pesnot Lerousseau, J., Parise, C., Ernst, MO., van Wassenhove, V.
License: CC0
Citation / DOI: doi:10.18112/openneuro.ds003922.v1.0.1
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 13
Recordings: 674
Tasks: 3
Channels: 306 (151), 342 (128), 323 (23)
Sampling rate (Hz): 1000.0
Duration (hours): 0.0
Pathology: Healthy
Modality: Multisensory
Type: Perception
Size on disk: 75.7 GB
File count: 674
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds003922.v1.0.1
API Reference#
Use the DS003922 class to access this dataset programmatically.
- class eegdash.dataset.DS003922(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
OpenNeuro dataset ds003922. Modality: meg; Experiment type: Perception; Subject type: Healthy. Subjects: 14; recordings: 164; tasks: 3.
- Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
Local dataset cache directory (cache_dir / dataset_id).
- Type: Path
- query#
Merged query with the dataset filter applied.
- Type: dict
- records#
Metadata records used to build the dataset, if pre-fetched.
- Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/ds003922
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds003922
Examples
>>> from eegdash.dataset import DS003922
>>> dataset = DS003922(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset