DS007537: EEG dataset, 23 subjects#
A multimodal dataset of EEG, eye-tracking, and physiological signals during smartphone interaction
Access recordings and metadata through EEGDash.
Citation: Prakash Mishra, Tapan K. Gandhi, Saurabh R. Gandhi (2026). A multimodal dataset of EEG, eye-tracking, and physiological signals during smartphone interaction. 10.18112/openneuro.ds007537.v1.0.0
Modality: eeg Subjects: 23 Recordings: 23 License: CC0 Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install

```bash
pip install eegdash
```

Access the data

```python
from eegdash.dataset import DS007537

dataset = DS007537(cache_dir="./data")

# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
```

Filter by subject

```python
dataset = DS007537(cache_dir="./data", subject="01")
```

Advanced query

```python
dataset = DS007537(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
```

Iterate recordings

```python
for rec in dataset:
    print(rec.subject, rec.raw.info["sfreq"])
```
If you use this dataset in your research, please cite the original authors.
BibTeX

```bibtex
@dataset{ds007537,
  title  = {A multimodal dataset of EEG, eye-tracking, and physiological signals during smartphone interaction},
  author = {Prakash Mishra and Tapan K. Gandhi and Saurabh R. Gandhi},
  year   = {2026},
  doi    = {10.18112/openneuro.ds007537.v1.0.0},
  url    = {https://doi.org/10.18112/openneuro.ds007537.v1.0.0},
}
```
About This Dataset#
This dataset contains multimodal physiological recordings acquired during smartphone interaction and video viewing conditions. The dataset includes simultaneous electroencephalography (EEG), eye-tracking, photoplethysmography (PPG), and galvanic skin response (GSR) signals.
Experimental Protocol
Participants complete two experimental conditions while wearing a 64-channel EEG cap and a head-mounted eye tracker, with simultaneous PPG and GSR recordings:
- Smartphone Monitor condition (10 min): Participants engage in naturalistic smartphone use on their personal devices. They are instructed to interact freely with one of their most frequently used applications (e.g., browsing, reading, or app interaction). No constraints are imposed on interaction style.
- Video Monitor condition (5 min): Participants view a standardized video presented on a monitor.
Participants
Number of participants: 23 (sub-01 to sub-23). Participants are healthy adults. Detailed demographic and experimental information (age, sex, and smartphone application type) is provided in participants.tsv at the root of the dataset.
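As a minimal sketch of how participants.tsv can be read with only the standard library: the BIDS-standard `participant_id` column is certain, but the other column names (`age`, `sex`, `app_type`) are assumptions here — check the actual header in the dataset root.

```python
import csv
import io

# Hypothetical participants.tsv payload; only "participant_id" is a
# guaranteed BIDS column name, the rest are illustrative assumptions.
sample_tsv = (
    "participant_id\tage\tsex\tapp_type\n"
    "sub-01\t24\tM\tbrowsing\n"
    "sub-02\t27\tF\treading\n"
)

def load_participants(text: str) -> list[dict]:
    """Parse a participants.tsv payload into one dict per subject."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return list(reader)

rows = load_participants(sample_tsv)
print(rows[0]["participant_id"], rows[0]["age"])  # → sub-01 24
```

In practice you would pass the contents of `<cache_dir>/ds007537/participants.tsv` instead of the inline sample.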
Hardware Synchronization and Event Markers
Session boundaries are marked using hardware-based transistor–transistor logic (TTL) synchronization pulses. Five TTL pulses are delivered at ~1-second intervals to mark session onset and offset, and additional TTL pulses are generated every 20 seconds during the session. These triggers are embedded in both the EEG and eye-tracking data streams, enabling precise temporal alignment across all modalities. EEG event markers are defined for two conditions: Smartphone Monitor (onset: S 12; offset: S 13) and Video Monitor (onset: S 22; offset: S 23). In the eye-tracking data, session boundaries are identified using TTL bursts of five pulses occurring at ~1-second intervals, with pulse characteristics (Type: sig; Direction: in; Value: 1.0), plus additional synchronization pulses at ~20-second intervals during sessions.
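The burst scheme above can be sketched in plain Python: group pulse timestamps into runs of five closely spaced pulses to recover session onset/offset, ignoring the isolated 20-second sync pulses. The timestamps, function name, and gap threshold below are illustrative assumptions, not part of the dataset's tooling.

```python
# Sketch: detect five-pulse TTL bursts (~1 s spacing) that mark session
# boundaries, while single ~20 s sync pulses are left out of the result.

def find_bursts(timestamps, n_pulses=5, max_gap=1.5):
    """Return the start time of each run of >= n_pulses pulses spaced <= max_gap s."""
    bursts = []
    run = [timestamps[0]] if timestamps else []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev <= max_gap:
            run.append(cur)
        else:
            if len(run) >= n_pulses:
                bursts.append(run[0])
            run = [cur]
    if len(run) >= n_pulses:
        bursts.append(run[0])
    return bursts

# Synthetic stream: onset burst at t=0 s, periodic 20 s sync pulses,
# offset burst at t=600 s.
pulses = [0, 1, 2, 3, 4] + list(range(24, 600, 20)) + [600, 601, 602, 603, 604]
print(find_bursts([float(t) for t in pulses]))  # → [0.0, 600.0]
```

In real data the timestamps would come from the eye-tracking event stream (Type: sig pulses) or the EEG trigger channel.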
Dataset Information#

| Field | Value |
| --- | --- |
| Dataset ID | ds007537 |
| Title | A multimodal dataset of EEG, eye-tracking, and physiological signals during smartphone interaction |
| Author (year) | — |
| Canonical | — |
| Importable as | DS007537 |
| Year | 2026 |
| Authors | Prakash Mishra, Tapan K. Gandhi, Saurabh R. Gandhi |
| License | CC0 |
| Citation / DOI | doi:10.18112/openneuro.ds007537.v1.0.0 |
| Source links | OpenNeuro, NeMAR, Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 23
Recordings: 23
Tasks: 1
Channels: 66
Sampling rate (Hz): 1000.0
Duration (hours): 6.64
Pathology: Not specified
Modality: EEG
Type: —
Size on disk: 6.1 GB
File count: 23
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds007537.v1.0.0
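A quick sanity check on these summary figures: dividing the total duration by the 23 recordings gives roughly 17 minutes per recording, which is broadly consistent with the ~15 minutes of task time (10 min smartphone + 5 min video) per participant plus synchronization periods.

```python
# Sanity check on the summary figures above.
total_hours = 6.638899166666667
n_recordings = 23

avg_minutes = total_hours * 60 / n_recordings
print(f"{avg_minutes:.1f} min per recording on average")  # → 17.3 min
```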
API Reference#
Use the DS007537 class to access this dataset programmatically.
- class eegdash.dataset.DS007537(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
  Bases: EEGDashDataset
  A multimodal dataset of EEG, eye-tracking, and physiological signals during smartphone interaction.
  - Study: ds007537 (OpenNeuro)
  - Author (year): —
  - Canonical: —
  - Also importable as: DS007537
  - Modality: eeg. Subjects: 23; recordings: 23; tasks: 1.
  Parameters:
  - cache_dir (str | Path) – Directory where data are cached locally.
  - query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
  - s3_bucket (str | None) – Base S3 bucket used to locate the data.
  - **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
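To illustrate how a MongoDB-style filter such as `{"subject": {"$in": [...]}}` selects records, here is a toy matcher supporting only equality and `$in`; the actual merging and matching logic lives inside EEGDashDataset and handles more operators, so this is a sketch of the semantics, not the implementation.

```python
# Toy matcher for the MongoDB-style filters accepted by the `query`
# parameter: equality and "$in" only.

def matches(record: dict, query: dict) -> bool:
    """Return True if `record` satisfies every condition in `query`."""
    for field, cond in query.items():
        value = record.get(field)
        if isinstance(cond, dict) and "$in" in cond:
            if value not in cond["$in"]:
                return False
        elif value != cond:
            return False
    return True

records = [{"subject": "01"}, {"subject": "02"}, {"subject": "03"}]
query = {"subject": {"$in": ["01", "02"]}}
print([r["subject"] for r in records if matches(r, query)])  # → ['01', '02']
```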
  Attributes:
  - data_dir (Path) – Local dataset cache directory (cache_dir / dataset_id).
  - query (dict) – Merged query with the dataset filter applied.
  - records (list[dict] | None) – Metadata records used to build the dataset, if pre-fetched.

Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References
- OpenNeuro dataset: https://openneuro.org/datasets/ds007537
- NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds007537
- DOI: https://doi.org/10.18112/openneuro.ds007537.v1.0.0
Examples

```python
>>> from eegdash.dataset import DS007537
>>> dataset = DS007537(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
```
See Also#
- eegdash.dataset.EEGDashDataset
- eegdash.dataset