DS002720#

A dataset recorded during development of a tempo-based brain-computer music interface

Access recordings and metadata through EEGDash.

Citation: Ian Daly, Nicoletta Nicolaou, Duncan Williams, Faustina Hwang, Alexis Kirke, Eduardo Miranda, Slawomir J. Nasuto (2020). A dataset recorded during development of a tempo-based brain-computer music interface. 10.18112/openneuro.ds002720.v1.0.1

Modality: eeg · Subjects: 18 · Recordings: 828 · License: CC0 · Source: openneuro · Citations: 1.0

Metadata: Complete (100%)

Quickstart#

Install

pip install eegdash

Access the data

from eegdash.dataset import DS002720

dataset = DS002720(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)

Filter by subject

dataset = DS002720(cache_dir="./data", subject="01")

Advanced query

dataset = DS002720(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)

Iterate recordings

for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
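The loop above can be extended into a dataset summary. A minimal sketch, using a hypothetical helper (not part of eegdash) with plain numbers so it runs without downloading anything; with real data the sample counts would come from `raw.n_times` and `raw.info['sfreq']` while iterating:

```python
# Hypothetical helper: sum per-recording sample counts into total hours.
# With real data, collect raw.n_times and raw.info['sfreq'] in the loop.
def total_hours(n_times_list, sfreq_hz):
    return sum(n / sfreq_hz for n in n_times_list) / 3600.0

# Three 20 s clips sampled at 1 kHz (20000 samples each) -> 60 s total
print(f"{total_hours([20000, 20000, 20000], 1000.0):.4f} h")
```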

If you use this dataset in your research, please cite the original authors.

BibTeX

@dataset{ds002720,
  title = {A dataset recorded during development of a tempo-based brain-computer music interface},
  author = {Ian Daly and Nicoletta Nicolaou and Duncan Williams and Faustina Hwang and Alexis Kirke and Eduardo Miranda and Slawomir J. Nasuto},
  doi = {10.18112/openneuro.ds002720.v1.0.1},
  url = {https://doi.org/10.18112/openneuro.ds002720.v1.0.1},
}

About This Dataset#

0. Sections

  1. Project

  2. Dataset

  3. Terms of Use

  4. Contents

  5. Method and Processing



1. PROJECT

Title: Brain-Computer Music Interface for Monitoring and Inducing Affective States (BCMI-MIdAS)

Dates: 2012-2017

Funding organisation: Engineering and Physical Sciences Research Council (EPSRC)

Grant no.: EP/J003077/1

2. DATASET

Title: EEG from a Brain-Computer Music Interface for controlling music tempo.

Description: This dataset accompanies the publication by Daly et al. (2018) and has been analysed in Daly et al. (2014a; 2014b) (please see Section 5 for full references). The dataset is obtained from a music-based Brain-Computer Interface constructed to allow users to modulate the tempo of a piece of music dynamically via intentional control. The dataset contains the electroencephalogram (EEG) data from 19 healthy participants instructed to increase the tempo of the music via kinaesthetically imagining squeezing a ball in their right hand or decrease the tempo by relaxing. The paradigm was split into 9 runs. The first was a calibration run, containing 30 trials in pairs of increase and decrease tempo trials.

Publication Year: 2018

Creators: Nicoletta Nicolaou, Ian Daly

Contributors: Isil Poyraz Bilgin, James Weaver, Asad Malik, Alexis Kirke, Duncan Williams.

Principal Investigator: Slawomir Nasuto (EP/J003077/1).

Co-Investigator: Eduardo Miranda (EP/J002135/1).

Organisation: University of Reading

Rights-holders: University of Reading

Source: The synthetic generator used to generate the music clips was presented in Williams et al., “Affective Calibration of Musical Feature Sets in an Emotionally Intelligent Music Composition System”, ACM Trans. Appl. Percept. 14, 3, Article 17 (May 2017), 13 pages. DOI: https://doi.org/10.1145/3059005

3. TERMS OF USE

Copyright University of Reading, 2018. This dataset is licensed by the rights-holder(s) under a Creative Commons Attribution 4.0 International Licence: https://creativecommons.org/licenses/by/4.0/.

4. CONTENTS

Zip File listing: The dataset comprises data from 19 subjects.

The data is provided in BIDS format. The sampling rate is 1 kHz and the EEG corresponding to a music clip is 20 s long (the duration of the clips).
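Given the stated 1 kHz sampling rate and 20 s clip duration, each clip's EEG segment spans a fixed number of samples; a quick check:

```python
# From Section 4: 1 kHz sampling rate, 20 s music clips.
sfreq_hz = 1000.0
clip_duration_s = 20.0
samples_per_clip = int(sfreq_hz * clip_duration_s)
print(samples_per_clip)  # 20000 samples per clip
```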

5. METHOD and PROCESSING

This information is available in the following publications:

[1] Daly, I., et al., Dataset paper, 2018.

[2] Daly, I., Hallowell, J., Hwang, F., Kirke, A., Malik, A., Roesch, E., Weaver, J., Williams, D., Miranda, E. R., Nasuto, S. J., “Changes in music tempo entrain movement related brain activity”, in Proc. 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC’14), Chicago, Illinois, USA; pp. , 2014a.

[3] Daly, I., Williams, D., Hwang, F., Kirke, A., Malik, A., Roesch, E., Weaver, J., Miranda, E. R., Nasuto, S. J., “Brain-computer music interfacing for continuous control of musical tempo”, in Proc. 6th International Brain-Computer Interface Conference 2014, Graz, Austria; 2014b.

Please cite these references and the reference to the music generator if you use this dataset in your study.

Thank you for your interest in our work.

Dataset Information#

Dataset ID

DS002720

Title

A dataset recorded during development of a tempo-based brain-computer music interface

Year

2020

Authors

Ian Daly, Nicoletta Nicolaou, Duncan Williams, Faustina Hwang, Alexis Kirke, Eduardo Miranda, Slawomir J. Nasuto

License

CC0

Citation / DOI

10.18112/openneuro.ds002720.v1.0.1

Source links

OpenNeuro | NeMAR | Source URL

Copy-paste BibTeX
@dataset{ds002720,
  title = {A dataset recorded during development of a tempo-based brain-computer music interface},
  author = {Ian Daly and Nicoletta Nicolaou and Duncan Williams and Faustina Hwang and Alexis Kirke and Eduardo Miranda and Slawomir J. Nasuto},
  doi = {10.18112/openneuro.ds002720.v1.0.1},
  url = {https://doi.org/10.18112/openneuro.ds002720.v1.0.1},
}

Found an issue with this dataset?

If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!

Report an Issue on GitHub

Technical Details#

Subjects & recordings
  • Subjects: 18

  • Recordings: 828

  • Tasks: 1

Channels & sampling rate
  • Channels: 19

  • Sampling rate (Hz): 1000.0

  • Duration (hours): 0.0

Tags
  • Pathology: Not specified

  • Modality: —

  • Type: —

Files & format
  • Size on disk: 2.4 GB

  • File count: 828

  • Format: BIDS

License & citation
  • License: CC0

  • DOI: 10.18112/openneuro.ds002720.v1.0.1

Provenance

API Reference#

Use the DS002720 class to access this dataset programmatically.

class eegdash.dataset.DS002720(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

OpenNeuro dataset ds002720. Modality: eeg; Experiment type: Affect; Subject type: Healthy. Subjects: 18; recordings: 165; tasks: 10.

Parameters:
  • cache_dir (str | Path) – Directory where data are cached locally.

  • query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.

  • s3_bucket (str | None) – Base S3 bucket used to locate the data.

  • **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

data_dir#

Local dataset cache directory (cache_dir / dataset_id).

Type:

Path

query#

Merged query with the dataset filter applied.

Type:

dict

records#

Metadata records used to build the dataset, if pre-fetched.

Type:

list[dict] | None

Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
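How a user query combines with the dataset filter can be pictured with a small sketch; `merge_query` is hypothetical (the real merge happens inside `EEGDashDataset`), but it mirrors the two rules above: the key `dataset` is reserved, and all other fields are ANDed in:

```python
# Hypothetical sketch of the query merge described above; not the
# actual eegdash implementation.
def merge_query(user_query=None):
    base = {"dataset": "ds002720"}  # fixed dataset filter
    if user_query and "dataset" in user_query:
        raise ValueError("query must not contain the key 'dataset'")
    return {**base, **(user_query or {})}

# MongoDB-style filter, as in the Quickstart's advanced query
print(merge_query({"subject": {"$in": ["01", "02"]}}))
```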

References

OpenNeuro dataset: https://openneuro.org/datasets/ds002720

NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds002720

Examples

>>> from eegdash.dataset import DS002720
>>> dataset = DS002720(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
__init__(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
save(path, overwrite=False)[source]#

Save the dataset to disk.

Parameters:
  • path (str or Path) – Destination file path.

  • overwrite (bool, default False) – If True, overwrite existing file.

Return type:

None

See Also#