DS005628#
Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site
Access recordings and metadata through EEGDash.
Citation: Juan Pablo Rosado-Aíza, Fernando José Domínguez-Morales, Tania Yareni Pech-Canul, Paola Guadalupe Vázquez-Rodríguez, Gustavo Navas-Reascos, Luz María Alonso-Valerdi, David I. Ibarra Zarate (2024). Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site. 10.18112/openneuro.ds005628.v1.0.0
Modality: eeg Subjects: 102 Recordings: 1077 License: CC0 Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS005628
dataset = DS005628(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS005628(cache_dir="./data", subject="01")
Advanced query
dataset = DS005628(
cache_dir="./data",
query={"subject": {"$in": ["01", "02"]}},
)
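The `$in` operator in the advanced query above follows MongoDB filter semantics. As a minimal illustration of what such a filter selects, here is a pure-Python sketch (the `matches` helper and the sample records are hypothetical; EEGDash evaluates these filters against its own metadata store):

```python
# Illustration only: how a MongoDB-style {"$in": [...]} filter selects records.
# The helper and records below are hypothetical, not EEGDash internals.
def matches(record, query):
    for field, cond in query.items():
        if isinstance(cond, dict) and "$in" in cond:
            if record.get(field) not in cond["$in"]:
                return False
        elif record.get(field) != cond:
            return False
    return True

records = [{"subject": "01"}, {"subject": "02"}, {"subject": "03"}]
query = {"subject": {"$in": ["01", "02"]}}
selected = [r for r in records if matches(r, query)]
print([r["subject"] for r in selected])  # → ['01', '02']
```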
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds005628,
title = {Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site},
author = {Juan Pablo Rosado-Aíza and Fernando José Domínguez-Morales and Tania Yareni Pech-Canul and Paola Guadalupe Vázquez-Rodríguez and Gustavo Navas-Reascos and Luz María Alonso-Valerdi and David I. Ibarra Zarate},
doi = {10.18112/openneuro.ds005628.v1.0.0},
url = {https://doi.org/10.18112/openneuro.ds005628.v1.0.0},
}
About This Dataset#
README
Authors
Juan Pablo Rosado-Aíza, Fernando José Domínguez-Morales, Tania Yareni Pech-Canul, Paola Guadalupe Vázquez-Rodríguez, Gustavo Navas-Reascos, Luz María Alonso-Valerdi, David I. Ibarra Zarate
Contact person
Gustavo Navas-Reascos https://orcid.org/0000-0003-0250-765X A01681952@tec.mx
Overview
Project name
Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site
Year that the project ran
2024
Brief overview
The purpose of this dataset is to analyze user experience in a virtual reality (VR) environment, focusing on a comparative study between visual and audiovisual stimuli based on the archaeological site of Edzna, Mexico. The immersive experience allowed participants to explore the site without needing to physically be there, and the experiment was conducted in a museum setting, offering an experience that goes beyond traditional visual-only exhibits. The dataset includes both electroencephalography (EEG) recordings from eight channels (Fz, C3, Cz, C4, Pz, PO7, Oz, and PO8) and user responses to the User Experience Questionnaire (UEQ), providing the data needed for future studies on how immersive environments affect user perception.
The EEG data were collected using a Unicorn Hybrid Black EEG system with a sampling rate of 250 Hz. Participants were exposed to two conditions: a visual-only stimulus and an audiovisual stimulus, both of which represented scenes from the archaeological site in VR. Prior to exposure, a baseline measurement was taken to capture the initial state of the participants. Data collection was conducted in MOSTLA, a digital innovation lab at the Tecnologico de Monterrey campus, and at the Museum of Contemporary Art (MARCO) in Monterrey.
Each EEG recording is shared in .set format and follows the BIDS structure. The recordings include eight channels of brainwave data for the baseline, visual, and audiovisual conditions. The signals are provided in both raw and preprocessed form. Additionally, an .xlsx file is provided with basic participant metadata, such as age, gender, and a unique identifier, as well as the UEQ responses.
Each EEG file contains data segmented into the three phases of the experiment: baseline, visual stimulus, and audiovisual stimulus, allowing researchers to directly compare neural responses across conditions.
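Since each file contains all three phases back to back, comparing conditions starts with slicing the continuous signal at the phase boundaries. A minimal sketch, assuming hypothetical boundary times (the real phase markers come from each recording's BIDS events):

```python
# Sketch: slicing one channel of a continuous recording into the three
# experimental phases. The boundary times below are hypothetical examples;
# the actual markers come from the BIDS events of each recording.
sfreq = 250                           # Hz, the sampling rate reported for this dataset
samples = list(range(sfreq * 90))     # stand-in for 90 s of one EEG channel
phases = {"baseline": (0, 30), "visual": (30, 60), "audiovisual": (60, 90)}

segments = {
    name: samples[t0 * sfreq:t1 * sfreq]
    for name, (t0, t1) in phases.items()
}
for name, seg in segments.items():
    print(name, len(seg), "samples")  # each 30 s phase → 7500 samples
```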
This dataset offers a comprehensive resource for researchers interested in investigating the effects of immersive VR environments on user engagement and attention.
Description of the contents of the dataset
sub-N - raw data; sub-Np - preprocessed data
Example: sub-1 - raw data of subject 1; sub-1p - preprocessed data of subject 1
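Given this naming convention, raw and preprocessed entries for the same subject can be paired programmatically. A small sketch (the `classify` helper is hypothetical, written against the convention described above):

```python
# Sketch: split subject labels into (subject number, preprocessed?) per the
# sub-N / sub-Np convention described in the README. Helper is hypothetical.
def classify(label):
    """Return (subject_number, is_preprocessed) for labels like 'sub-1' / 'sub-1p'."""
    body = label.removeprefix("sub-")
    if body.endswith("p"):
        return int(body[:-1]), True
    return int(body), False

print(classify("sub-1"))   # → (1, False): raw data of subject 1
print(classify("sub-1p"))  # → (1, True): preprocessed data of subject 1
```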
Subjects
A total of 51 participants were recruited.
Apparatus
Unicorn Hybrid Black EEG system, VR headset, headphones
Experimental location
MOSTLA, a digital innovation lab at Tecnologico de Monterrey, located at Av. Eugenio Garza Sada 2501 Sur, Tecnologico, 64849 Monterrey, N.L., Mexico; and MARCO, a contemporary art museum located at Zuazua y Jardón, Centro, 64000 Monterrey, N.L., Mexico.
Notes
All metadata, including the UEQ answers, can be obtained from the file metadata.xlsx.
The videos presented to the participants are available at:
Audiovisual video: https://youtu.be/FBWbtSFwVuo
Visual video: https://youtu.be/aLzzl0ygBnc
Dataset Information#
Dataset ID: ds005628
Title: Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site
Year: 2024
Authors: Juan Pablo Rosado-Aíza, Fernando José Domínguez-Morales, Tania Yareni Pech-Canul, Paola Guadalupe Vázquez-Rodríguez, Gustavo Navas-Reascos, Luz María Alonso-Valerdi, David I. Ibarra Zarate
License: CC0
Citation / DOI: 10.18112/openneuro.ds005628.v1.0.0
Source links: OpenNeuro | NeMAR | Source URL
Copy-paste BibTeX
@dataset{ds005628,
title = {Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site},
author = {Juan Pablo Rosado-Aíza and Fernando José Domínguez-Morales and Tania Yareni Pech-Canul and Paola Guadalupe Vázquez-Rodríguez and Gustavo Navas-Reascos and Luz María Alonso-Valerdi and David I. Ibarra Zarate},
doi = {10.18112/openneuro.ds005628.v1.0.0},
url = {https://doi.org/10.18112/openneuro.ds005628.v1.0.0},
}
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 102
Recordings: 1077
Tasks: 1
Channels: 8
Sampling rate (Hz): 250.0
Duration (hours): 0.0
Pathology: Healthy
Modality: Multisensory
Type: Attention
Size on disk: 633.7 MB
File count: 1077
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds005628.v1.0.0
API Reference#
Use the DS005628 class to access this dataset programmatically.
- class eegdash.dataset.DS005628(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
OpenNeuro dataset ds005628. Modality: eeg; Experiment type: Attention; Subject type: Healthy. Subjects: 102; recordings: 306; tasks: 1.
- Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
Local dataset cache directory (cache_dir / dataset_id).
- Type: Path
- query#
Merged query with the dataset filter applied.
- Type: dict
- records#
Metadata records used to build the dataset, if pre-fetched.
- Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/ds005628 NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds005628
Examples
>>> from eegdash.dataset import DS005628
>>> dataset = DS005628(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
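The parameter docs above state that the user query is AND-ed with the dataset selection and must not contain the key `dataset`. One way that merge could behave, sketched as an assumption for illustration (not EEGDash internals):

```python
# Sketch: AND-ing a user query with the fixed dataset filter, as described in
# the parameter docs above. The merge logic is an assumption, not EEGDash code.
def merge_query(user_query):
    base = {"dataset": "ds005628"}   # fixed filter; must not be overridden
    if user_query and "dataset" in user_query:
        raise ValueError("query must not contain the key 'dataset'")
    return {**base, **(user_query or {})}

print(merge_query({"subject": {"$in": ["01", "02"]}}))
```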
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset