DS005662#
A high-quality EEG dataset for studying visual touch perception
Access recordings and metadata through EEGDash.
Citation: Sophie Smit, Almudena Ramírez-Haro, Manuel Varlet, Denise Moerel, Genevieve L. Quek, Tijl Grootswagers (2024). A high-quality EEG dataset for studying visual touch perception. 10.18112/openneuro.ds005662.v2.0.0
Modality: eeg Subjects: 80 Recordings: 175 License: CC0 Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS005662
dataset = DS005662(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS005662(cache_dir="./data", subject="01")
Advanced query
dataset = DS005662(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info["sfreq"])
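The `$in` operator in the advanced query above follows MongoDB filter semantics: a record matches when the field's value is one of the listed values. As a minimal pure-Python illustration (the `matches` helper below is hypothetical, written only to show the semantics; it is not part of eegdash):

```python
def matches(record, query):
    """Check a record against a tiny subset of MongoDB-style filters."""
    for field, cond in query.items():
        value = record.get(field)
        if isinstance(cond, dict) and "$in" in cond:
            if value not in cond["$in"]:
                return False
        elif value != cond:  # plain equality match
            return False
    return True

records = [{"subject": "01"}, {"subject": "02"}, {"subject": "03"}]
query = {"subject": {"$in": ["01", "02"]}}
selected = [r for r in records if matches(r, query)]
print([r["subject"] for r in selected])  # → ['01', '02']
```

The real query is evaluated server-side against the metadata database; this sketch only mirrors the matching logic.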
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds005662,
title = {A high-quality EEG dataset for studying visual touch perception},
author = {Sophie Smit and Almudena Ramírez-Haro and Manuel Varlet and Denise Moerel and Genevieve L. Quek and Tijl Grootswagers},
doi = {10.18112/openneuro.ds005662.v2.0.0},
url = {https://doi.org/10.18112/openneuro.ds005662.v2.0.0},
}
About This Dataset#
Data collection took place at The MARCS Institute for Brain, Behaviour and Development in Sydney, Australia, and the study was approved by the Western Sydney University Ethics Committee. We recorded EEG data while participants viewed rapid streams of videos depicting touch to a hand, adapted from the Validated Touch-Video Database (Smit & Rich, 2025). Both the adapted videos used in this project and the original videos with their validation data are available on OSF (https://osf.io/jvkqa/). There were 32 sequences in total, comprising 2880 non-target trials (90 unique videos, each presented 8 times) alongside a variable number of target trials (showing touch to an object). The inter-trial interval was 200 ms, and the experimental task lasted approximately 55 minutes including breaks. We also recorded questionnaire responses. Whole-brain 64-channel EEG data were recorded at 2048 Hz using an Active Two Biosemi system (Biosemi, Inc.) with standard 10-20 caps. Stimuli were presented using Python and PsychoPy software version 2023.3.1.
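At 2048 Hz the recordings are heavily oversampled for most ERP analyses, so downsampling is a common first step. A real pipeline would use MNE-Python's `raw.resample`, which applies a proper anti-aliasing filter; the crude block-averaging sketch below only illustrates the arithmetic of going from the native rate to a typical analysis rate, and all variable names are illustrative:

```python
def block_downsample(samples, factor):
    """Downsample by averaging consecutive blocks (crude anti-aliasing)."""
    n = len(samples) // factor
    return [sum(samples[i * factor:(i + 1) * factor]) / factor
            for i in range(n)]

sfreq = 2048              # native Biosemi sampling rate
target = 256              # a typical analysis rate
factor = sfreq // target  # 8 native samples per output sample
one_second = list(range(sfreq))  # stand-in for one second of one channel
down = block_downsample(one_second, factor)
print(len(down))  # → 256
```

With 64 channels at 2048 Hz, downsampling by a factor of 8 shrinks the data accordingly, which matters given the ~108 GB footprint of the full dataset.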
Dataset Information#
| Field | Value |
| --- | --- |
| Dataset ID | ds005662 |
| Title | A high-quality EEG dataset for studying visual touch perception |
| Year | 2024 |
| Authors | Sophie Smit, Almudena Ramírez-Haro, Manuel Varlet, Denise Moerel, Genevieve L. Quek, Tijl Grootswagers |
| License | CC0 |
| Citation / DOI | 10.18112/openneuro.ds005662.v2.0.0 |
| Source links | OpenNeuro · NeMAR · Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 80
Recordings: 175
Tasks: 1
Channels: 65
Sampling rate (Hz): 2048.0
Duration (hours): 0.0
Pathology: Healthy
Modality: Visual
Type: Perception
Size on disk: 107.8 GB
File count: 175
Format: BIDS
License: CC0
DOI: 10.18112/openneuro.ds005662.v2.0.0
API Reference#
Use the DS005662 class to access this dataset programmatically.
- class eegdash.dataset.DS005662(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

  Bases: EEGDashDataset

  OpenNeuro dataset ds005662. Modality: eeg; Experiment type: Perception; Subject type: Healthy. Subjects: 80; recordings: 80; tasks: 1.

  - Parameters:
    - cache_dir (str | Path) – Directory where data are cached locally.
    - query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
    - s3_bucket (str | None) – Base S3 bucket used to locate the data.
    - **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
  - data_dir#
    Local dataset cache directory (cache_dir / dataset_id).
    - Type: Path
  - query#
    Merged query with the dataset filter applied.
    - Type: dict
  - records#
    Metadata records used to build the dataset, if pre-fetched.
    - Type: list[dict] | None
Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References
OpenNeuro dataset: https://openneuro.org/datasets/ds005662
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds005662
Examples
>>> from eegdash.dataset import DS005662
>>> dataset = DS005662(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
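The notes above say the user query is merged with a fixed dataset filter and must not contain the key dataset. That merge can be pictured as a dict update in which the dataset selector always wins; the sketch below is a hypothetical illustration, not eegdash internals:

```python
def merge_query(user_query, dataset_id):
    """Combine a user query with the fixed dataset filter (illustrative)."""
    merged = dict(user_query or {})
    if "dataset" in merged:
        raise ValueError("query must not contain the key 'dataset'")
    merged["dataset"] = dataset_id  # the dataset filter is always applied
    return merged

merged = merge_query({"subject": {"$in": ["01", "02"]}}, "ds005662")
print(merged["dataset"])  # → ds005662
```

Passing a query that already contains a dataset key raises an error here, mirroring the documented restriction on the query parameter.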
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset