DS006735#
Chimeric music reveals an interaction of pitch and time in electrophysiological signatures of music encoding
Access recordings and metadata through EEGDash.
Citation: Tong Shan, Edmund C. Lalor, Ross K. Maddox (2025). Chimeric music reveals an interaction of pitch and time in electrophysiological signatures of music encoding. 10.18112/openneuro.ds006735.v2.0.0
Modality: eeg Subjects: 27 Recordings: 220 License: CC0 Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS006735
dataset = DS006735(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS006735(cache_dir="./data", subject="01")
Advanced query
dataset = DS006735(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds006735,
  title = {Chimeric music reveals an interaction of pitch and time in electrophysiological signatures of music encoding},
  author = {Tong Shan and Edmund C. Lalor and Ross K. Maddox},
  year = {2025},
  doi = {10.18112/openneuro.ds006735.v2.0.0},
  url = {https://doi.org/10.18112/openneuro.ds006735.v2.0.0},
}
About This Dataset#
Details related to access to the data
Please contact the following authors for further information:
Tong Shan (email: tongshan@stanford.edu)
Ross K. Maddox (email: rkmaddox@med.umich.edu)
Overview
This study examines pitch–time interactions in music processing by introducing “chimeric music”: two distinct melodies are paired and their pitch contours and note onset times are exchanged, creating two new melodies. This distorts the musical patterns while preserving the marginal statistics of each original piece’s pitch and temporal sequences.
Data were collected from September to November 2023.
The details of the experiment can be found in Shan et al. (2024). The experiment had two phases. In the first phase, ten trials of one-minute click stimuli were presented to the subjects. In the second phase, the two types of monophonic music clips (original and chimeric) were presented, with 33 trials of each type in shuffled order. Between trials there was a 0.5 s pause.
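For illustration only, the shuffled phase-2 trial ordering described above can be sketched as follows (the labels and fixed seed are hypothetical, not the lab's actual randomization code):

```python
import random

# Illustrative reconstruction of the phase-2 trial order:
# 33 original-music and 33 chimeric-music trials, presented in shuffled order.
trials = ["original"] * 33 + ["chimeric"] * 33
random.seed(0)            # fixed seed for reproducibility of this sketch only
random.shuffle(trials)
print(len(trials), trials[:5])
```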
The analysis code for this study is available in the GitHub repository maddoxlab/Chimeric_music.
Format
This dataset is formatted according to the EEG Brain Imaging Data Structure (BIDS). It includes EEG recordings from subjects 001 through 027 in raw BrainVision format (.eeg, .vhdr, and .vmrk triplets).
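Because BrainVision .vmrk marker files are plain text, their marker entries can be inspected without any EEG toolbox. A minimal stdlib sketch of the standard `Mk<n>=<type>,<description>,<position>,<size>,<channel>` line format (the sample lines below are illustrative, not taken from this dataset):

```python
import re

def parse_vmrk_markers(text: str) -> list[dict]:
    """Parse marker lines from a BrainVision .vmrk [Marker Infos] section.

    Each marker line has the form:
        Mk<n>=<type>,<description>,<position>,<size>,<channel>
    where position/size are in data points and channel 0 means all channels.
    """
    markers = []
    pattern = re.compile(r"^Mk(\d+)=([^,]*),([^,]*),(\d+),(\d+),(-?\d+)")
    for line in text.splitlines():
        m = pattern.match(line.strip())
        if m:
            markers.append({
                "number": int(m.group(1)),
                "type": m.group(2),
                "description": m.group(3),
                "position": int(m.group(4)),
                "size": int(m.group(5)),
                "channel": int(m.group(6)),
            })
    return markers

# Illustrative marker lines in BrainVision syntax (not real dataset content)
sample = """[Marker Infos]
Mk1=New Segment,,1,1,0
Mk2=Stimulus,S  1,5000,1,0
"""
print(parse_vmrk_markers(sample))
```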
Subjects
27 subjects participated in this study.
Subject inclusion criteria
Age between 18 and 40 years.
Normal hearing: audiometric thresholds of 20 dB HL or better from 500 to 8000 Hz.
English as their primary language.
Self-reported normal or corrected-to-normal vision.
Twenty-seven participants completed the experiment, aged 22.9 ± 3.9 years (mean ± SD).
Apparatus
Subjects were seated on a chair in a sound-isolating booth in front of a 24-inch BenQ monitor at a viewing distance of approximately 60 cm. Stimuli were presented at an average level of 60 dB SPL and a sampling rate of 48000 Hz through ER-2 insert earphones plugged into an RME Babyface Pro digital sound card. Stimulus presentation was controlled by a Python script using a custom package, expyfun.
Following the experimental session, participants completed a self-reported musicianship questionnaire (adapted from Whiteford et al., 2025). The questionnaire is included in this repository.
Dataset Information#
Dataset ID | DS006735
Title | Chimeric music reveals an interaction of pitch and time in electrophysiological signatures of music encoding
Year | 2025
Authors | Tong Shan, Edmund C. Lalor, Ross K. Maddox
License | CC0
Citation / DOI | 10.18112/openneuro.ds006735.v2.0.0
Source links | OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 27
Recordings: 220
Tasks: 1
Channels: 36 (48), 63 (4), 34 (2)
Sampling rate (Hz): 10000.0
Duration (hours): 0.0
Pathology: Healthy
Modality: Auditory
Type: Perception
Size on disk: 175.9 GB
File count: 220
Format: BIDS
License: CC0
DOI: 10.18112/openneuro.ds006735.v2.0.0
API Reference#
Use the DS006735 class to access this dataset programmatically.
- class eegdash.dataset.DS006735(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)
Bases: EEGDashDataset
OpenNeuro dataset ds006735. Modality: eeg; Experiment type: Perception; Subject type: Healthy. Subjects: 27; recordings: 220; tasks: 1.
Parameters:
- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
Attributes:
- data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
- query (dict) – Merged query with the dataset filter applied.
- records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/ds006735
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds006735
Examples
>>> from eegdash.dataset import DS006735
>>> dataset = DS006735(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
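To make the MongoDB-style semantics of the query parameter concrete, here is a plain-Python sketch of how a $in filter selects records. The field values below are purely illustrative (valid fields are listed in ALLOWED_QUERY_FIELDS), and the tiny evaluator only mimics the subset of the syntax it uses:

```python
# A query shaped like what would be passed to DS006735(query=...).
# {"$in": [...]} matches when the record's field value is in the list.
query = {"subject": {"$in": ["01", "02"]}, "task": "chimera"}

# Toy records standing in for recording metadata (illustrative values only)
records = [
    {"subject": "01", "task": "chimera"},
    {"subject": "03", "task": "chimera"},
    {"subject": "02", "task": "clicks"},
]

def matches(record: dict, query: dict) -> bool:
    """Minimal evaluator for the subset of MongoDB syntax used above."""
    for field, condition in query.items():
        value = record.get(field)
        if isinstance(condition, dict) and "$in" in condition:
            if value not in condition["$in"]:
                return False
        elif value != condition:
            return False
    return True

selected = [r for r in records if matches(r, query)]
print(selected)  # only subject "01" satisfies both conditions
```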
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset