DS007471: EEG dataset, 31 subjects#
Joint agency EEG dataset
Access recordings and metadata through EEGDash.
Citation: Zijun Zhou, Anna Zamm, Justin Christensen, Vinesh Rao, Janeen Loehr (2026). Joint agency EEG dataset. 10.18112/openneuro.ds007471.v1.0.0
Modality: eeg | Subjects: 31 | Recordings: 31 | License: CC0 | Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS007471
dataset = DS007471(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS007471(cache_dir="./data", subject="01")
Advanced query
dataset = DS007471(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds007471,
  title = {Joint agency EEG dataset},
  author = {Zijun Zhou and Anna Zamm and Justin Christensen and Vinesh Rao and Janeen Loehr},
  doi = {10.18112/openneuro.ds007471.v1.0.0},
  url = {https://doi.org/10.18112/openneuro.ds007471.v1.0.0},
}
About This Dataset#
Behavioural and EEG data from an EEG hyperscanning study examining cognitive and neural signals underlying the sense of joint agency during a musical joint action task.
Dataset Structure
The primary folder includes a separate folder for each pair (sub-##). Each pair folder contains:
Behavioural Data
Located in: sub-##/beh/
File: sub-##_task-jointaction_beh.tsv
EEG Data
Located in: sub-##/eeg/
Files (BrainVision format):
- sub-##_task-jointaction_eeg.eeg
- sub-##_task-jointaction_eeg.vhdr
- sub-##_task-jointaction_eeg.vmrk
Derivatives Folder
The derivatives/ folder contains:
- behavioural_all.tsv: compiled behavioural data across all pairs.
- 32chanElectrodePositions.elp: electrode positions used for EEG data acquisition and analysis.
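The compiled behavioural table can be read with the standard library's csv module. A minimal sketch using an inline sample; the column names here are illustrative guesses based on the column descriptions below, not the file's actual header row:

```python
import csv
import io

# Illustrative sample standing in for derivatives/behavioural_all.tsv.
# Column names are assumptions based on the column descriptions in this
# README; check the real file's header row before relying on them.
sample_tsv = (
    "pair_number\tparticipant_number\tblock_number\ttrial_number\tjoint_agency_rating\n"
    "1\t11\t1\t1\t5\n"
    "1\t12\t1\t1\t6\n"
)

rows = list(csv.DictReader(io.StringIO(sample_tsv), delimiter="\t"))
for row in rows:
    print(row["pair_number"], row["participant_number"], row["joint_agency_rating"])
```

For the real file, replace the StringIO with an `open("derivatives/behavioural_all.tsv")` handle.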
Behavioural Data Description
The following column descriptions apply to both:
- behavioural_all.tsv
- sub-##_task-jointaction_beh.tsv
Pair Number
Values: 1–32
Participant Number
The first one or two digits represent the pair number.
The last digit represents seating position:
- 1 = left participant
- 2 = right participant
Examples:
- 11 = left participant in pair 1
- 202 = right participant in pair 20
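This encoding is easy to apply programmatically. A small sketch (the function name is ours, not part of the dataset or EEGDash):

```python
def decode_participant(participant_number: int) -> tuple[int, str]:
    """Split a participant number into (pair, seat) following the scheme
    above: the leading digit(s) give the pair number, the last digit gives
    the seating position (1 = left, 2 = right)."""
    pair, seat_digit = divmod(participant_number, 10)
    return pair, {1: "left", 2: "right"}[seat_digit]

print(decode_participant(11))   # (1, 'left')
print(decode_participant(202))  # (20, 'right')
```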
Block Number
Test block number for a given trial (1–8).
Trial Number
Each pair performed:
- 8 tone sequences:
  - 4 musical duets
  - 4 constant pitch sequences
- 5 joint trials per sequence
Total:
- 40 test trials per pair
- Trial numbers range from 1–40
Experimental Condition
- 0 = constant pitch sequences
- 1 = musical duets
Part Performed
Indicates which part of the tone sequence the participant performed:
- 0 = higher-pitch part (for constant pitch sequences) or melody part (for musical duets)
- 1 = lower-pitch part (for constant pitch sequences) or accompaniment part (for musical duets)
Tone Sequence
1. Twinkle Twinkle Little Star
2. Hush Little Baby
3. B.I.N.G.O.
4. Yankee Doodle
5. Constant pitch sequence with A4 as higher-pitch part
6. Constant pitch sequence with C5 as higher-pitch part
7. Constant pitch sequence with E♭5 as higher-pitch part
8. Constant pitch sequence with F♯5 as higher-pitch part
Joint Agency Ratings
Self-reported rating scale: 1–7
Mean Synchronization Performance
The mean synchronization performance for each trial was calculated as follows. First, we calculated the absolute asynchrony between the two participants’ note onsets at each beat. Then, we converted each asynchrony to a proportion of the inter-onset interval (IOI) from the preceding note onset to the current note onset, which we averaged across the two participants and across all beats in the sequence.
Standard Deviation (SD) of Synchronization Performance
The SD of synchronization performance was defined as the standard deviation of the asynchronies across all beats in a given trial.
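The two measures can be sketched with NumPy on synthetic onset times. This is our reading of the description above (the function name, and dropping the first beat because it has no preceding IOI, are our assumptions), not the authors' analysis code:

```python
import numpy as np

def sync_performance(onsets_left, onsets_right):
    """Mean and SD of synchronization performance for one trial, per the
    description above: absolute asynchrony between the two participants'
    note onsets at each beat, expressed as a proportion of the preceding
    inter-onset interval (IOI), averaged across the two participants.
    The first beat is dropped because it has no preceding IOI."""
    left = np.asarray(onsets_left, dtype=float)
    right = np.asarray(onsets_right, dtype=float)
    async_abs = np.abs(left - right)[1:]    # absolute asynchrony per beat
    prop_left = async_abs / np.diff(left)   # proportion of each participant's IOI
    prop_right = async_abs / np.diff(right)
    prop = (prop_left + prop_right) / 2.0   # average across participants
    return prop.mean(), prop.std()

# Synthetic onset times (seconds) for the two participants in one trial.
mean_prop, sd_prop = sync_performance(
    [0.0, 0.5, 1.0, 1.5], [0.02, 0.51, 1.02, 1.49]
)
print(mean_prop, sd_prop)
```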
EEG Data Description
For each EEG dataset within each pair’s folder:
- Channels 1–32: left participant EEG
- Channels 33–64: right participant EEG
Data are stored in BrainVision format.
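Separating the two participants is a matter of slicing the channel axis. A sketch on a synthetic array standing in for what `raw.get_data()` would return; with MNE, the same split can be done by picking the first or last 32 channel names:

```python
import numpy as np

# Synthetic stand-in for one hyperscanning recording: 64 channels x samples,
# in the shape raw.get_data() would return.
n_samples = 1000
data = np.random.randn(64, n_samples)

left_eeg = data[:32, :]   # channels 1-32: left participant
right_eeg = data[32:, :]  # channels 33-64: right participant
print(left_eeg.shape, right_eeg.shape)  # (32, 1000) (32, 1000)
```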
Event Codes (Test Section)
The following event markers are present during the test section (see Figure 1 for schematic reference):
- S1 – the beginning of the test trials portion of the experiment
- S10 – a condition marker indicating the beginning of a block of musical duets
- S11 – a condition marker indicating the beginning of a block of constant pitch sequences
- S105 – the start of each trial, triggered by pressing the space bar
- S128 – the first five S128s mark the metronome tone onsets; remaining S128s mark the tone onsets from the left participant’s e-music box
- S4 – tone onsets from the right participant’s e-music box
- S2 – the end of the left participant’s performance, marked one beat after the last of their 16-beat tone sequence
- S3 – the end of the right participant’s performance, marked one beat after the last of their 16-beat tone sequence
- S106 – the end of each trial after the rating scales were completed
- S107 – the end of each block
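The marker semantics above can be captured as a lookup table. A plain-Python sketch; note that the exact marker strings in the .vmrk files may carry padding or a "Stimulus/" prefix, so the keys and the sample markers below are illustrative:

```python
# Meanings of the test-section event codes, per the description above.
EVENT_MEANINGS = {
    "S1": "start of the test trials portion",
    "S10": "block start: musical duets",
    "S11": "block start: constant pitch sequences",
    "S105": "trial start (space bar press)",
    "S128": "metronome tone (first five) / left participant tone onset",
    "S4": "right participant tone onset",
    "S2": "end of left participant's performance",
    "S3": "end of right participant's performance",
    "S106": "trial end (after rating scales)",
    "S107": "block end",
}

# Illustrative (sample, code) markers, as might be parsed from a .vmrk file.
markers = [(12000, "S105"), (12500, "S128"), (12750, "S4"), (90000, "S106")]
for sample, code in markers:
    print(sample, code, "->", EVENT_MEANINGS.get(code, "unknown"))
```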
Figure: Illustration of the event codes occurring over time in the dataset.
Notes
- Data are organized in BIDS format.
- BrainVision files (.eeg, .vhdr, .vmrk) contain raw hyperscanning EEG data.
- Behavioural data are provided per pair and as a compiled dataset in the derivatives folder.
Dataset Information#
Dataset ID: ds007471
Title: Joint agency EEG dataset
Author (year): Zhou2026
Canonical: Zhou2024
Importable as: DS007471, Zhou2026, Zhou2024
Year: 2026
Authors: Zijun Zhou, Anna Zamm, Justin Christensen, Vinesh Rao, Janeen Loehr
License: CC0
Citation / DOI: doi:10.18112/openneuro.ds007471.v1.0.0
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 31
Recordings: 31
Tasks: 1
Channels: 64
Sampling rate (Hz): 1000.0
Duration (hours): 18.78
Pathology: Healthy
Modality: Auditory
Type: Other
Size on disk: 8.1 GB
File count: 31
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds007471.v1.0.0
API Reference#
Use the DS007471 class to access this dataset programmatically.
class eegdash.dataset.DS007471(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

Joint agency EEG dataset.

- Study: ds007471 (OpenNeuro)
- Author (year): Zhou2026
- Canonical: Zhou2024

Also importable as: DS007471, Zhou2026, Zhou2024.
Modality: eeg; Experiment type: Other; Subject type: Healthy. Subjects: 31; recordings: 31; tasks: 1.

Parameters:
- cache_dir (str | Path) – Directory where data are cached locally.
- query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
- s3_bucket (str | None) – Base S3 bucket used to locate the data.
- **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
data_dir#
Local dataset cache directory (cache_dir / dataset_id).
Type: Path

query#
Merged query with the dataset filter applied.
Type: dict

records#
Metadata records used to build the dataset, if pre-fetched.
Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References

OpenNeuro dataset: https://openneuro.org/datasets/ds007471
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds007471
DOI: https://doi.org/10.18112/openneuro.ds007471.v1.0.0
Examples
>>> from eegdash.dataset import DS007471
>>> dataset = DS007471(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset