DS004514#
Simultaneous EEG and fNIRS recordings for semantic decoding of imagined animals and tools
Access recordings and metadata through EEGDash.
Citation: Milan Rybář, Riccardo Poli, Ian Daly (2023). Simultaneous EEG and fNIRS recordings for semantic decoding of imagined animals and tools. 10.18112/openneuro.ds004514.v1.1.2
Modality: eeg | Subjects: 12 | Recordings: 161 | License: CC0 | Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS004514
dataset = DS004514(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS004514(cache_dir="./data", subject="01")
Advanced query
dataset = DS004514(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
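The `query` argument accepts MongoDB-style operators such as `$in`. As a rough illustration of the matching semantics only (this is not EEGDash's internal implementation), a `$in` clause keeps records whose field value appears in the given list:

```python
# Illustrative sketch of how a MongoDB-style {"field": {"$in": [...]}} filter
# selects records. NOT EEGDash's internal implementation.

def matches_in_filter(record, query):
    """Return True if `record` satisfies every {"field": {"$in": values}} clause."""
    for field, clause in query.items():
        if record.get(field) not in clause["$in"]:
            return False
    return True

records = [{"subject": "01"}, {"subject": "02"}, {"subject": "03"}]
query = {"subject": {"$in": ["01", "02"]}}
selected = [r for r in records if matches_in_filter(r, query)]
# selected now holds the records for subjects "01" and "02"
```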
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds004514,
  title = {Simultaneous EEG and fNIRS recordings for semantic decoding of imagined animals and tools},
  author = {Milan Rybář and Riccardo Poli and Ian Daly},
  year = {2023},
  doi = {10.18112/openneuro.ds004514.v1.1.2},
  url = {https://doi.org/10.18112/openneuro.ds004514.v1.1.2},
}
About This Dataset#
Description
This dataset contains simultaneous electroencephalography (EEG) and near-infrared spectroscopy (fNIRS) signals recorded from 12 participants while performing a silent naming task and three sensory-based imagery tasks using visual, auditory, and tactile perception. Participants were asked to visualize an object in their minds, imagine the sounds made by the object, and imagine the feeling of touching the object.
EEG
EEG data were acquired with a BioSemi ActiveTwo system with 64 electrodes positioned according to the international 10-20 system, plus one electrode on each earlobe as references (‘EXG1’ channel is the left ear electrode and ‘EXG2’ channel is the right ear electrode). Additionally, 2 electrodes placed on the left hand measured galvanic skin response (‘GSR1’ channel) and a respiration belt around the waist measured respiration (‘Resp’ channel). The sampling rate was 2048 Hz.
The electrode names were saved in a default BioSemi labeling scheme (A1-A32, B1-B32). See the Biosemi documentation for the corresponding international 10-20 naming scheme (https://www.biosemi.com/pics/cap_64_layout_medium.jpg, https://www.biosemi.com/headcap.htm).
For convenience, the following ordered channels
['A1', 'A2', 'A3', 'A4', 'A5', 'A6', 'A7', 'A8', 'A9', 'A10', 'A11', 'A12', 'A13', 'A14', 'A15', 'A16', 'A17', 'A18', 'A19', 'A20', 'A21', 'A22', 'A23', 'A24', 'A25', 'A26', 'A27', 'A28', 'A29', 'A30', 'A31', 'A32', 'B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B9', 'B10', 'B11', 'B12', 'B13', 'B14', 'B15', 'B16', 'B17', 'B18', 'B19', 'B20', 'B21', 'B22', 'B23', 'B24', 'B25', 'B26', 'B27', 'B28', 'B29', 'B30', 'B31', 'B32']
can thus be renamed to
['Fp1', 'AF7', 'AF3', 'F1', 'F3', 'F5', 'F7', 'FT7', 'FC5', 'FC3', 'FC1', 'C1', 'C3', 'C5', 'T7', 'TP7', 'CP5', 'CP3', 'CP1', 'P1', 'P3', 'P5', 'P7', 'P9', 'PO7', 'PO3', 'O1', 'Iz', 'Oz', 'POz', 'Pz', 'CPz', 'Fpz', 'Fp2', 'AF8', 'AF4', 'AFz', 'Fz', 'F2', 'F4', 'F6', 'F8', 'FT8', 'FC6', 'FC4', 'FC2', 'FCz', 'Cz', 'C2', 'C4', 'C6', 'T8', 'TP8', 'CP6', 'CP4', 'CP2', 'P2', 'P4', 'P6', 'P8', 'P10', 'PO8', 'PO4', 'O2']
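Because the two lists above are ordered consistently, the rename mapping can be built directly in Python. A minimal sketch (the final call assumes an MNE `Raw` object such as the one loaded in the Quickstart):

```python
# Build a BioSemi (A1-A32, B1-B32) -> international 10-20 rename mapping
# from the two ordered channel lists given in the README.
biosemi = [f"A{i}" for i in range(1, 33)] + [f"B{i}" for i in range(1, 33)]
ten_twenty = [
    'Fp1', 'AF7', 'AF3', 'F1', 'F3', 'F5', 'F7', 'FT7', 'FC5', 'FC3',
    'FC1', 'C1', 'C3', 'C5', 'T7', 'TP7', 'CP5', 'CP3', 'CP1', 'P1',
    'P3', 'P5', 'P7', 'P9', 'PO7', 'PO3', 'O1', 'Iz', 'Oz', 'POz',
    'Pz', 'CPz', 'Fpz', 'Fp2', 'AF8', 'AF4', 'AFz', 'Fz', 'F2', 'F4',
    'F6', 'F8', 'FT8', 'FC6', 'FC4', 'FC2', 'FCz', 'Cz', 'C2', 'C4',
    'C6', 'T8', 'TP8', 'CP6', 'CP4', 'CP2', 'P2', 'P4', 'P6', 'P8',
    'P10', 'PO8', 'PO4', 'O2',
]
mapping = dict(zip(biosemi, ten_twenty))

# With an MNE Raw object loaded (see Quickstart), apply the mapping:
# raw.rename_channels(mapping)
```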
fNIRS
fNIRS data were acquired with a NIRx NIRScoutXP continuous-wave imaging system equipped with 4 light detectors, 8 light emitters (sources), and low-profile fNIRS optodes. Both electrodes and optodes were placed in a NIRx NIRScap designed for integrated fNIRS-EEG layouts. Two different montages were used, frontal and temporal; see the references for more information.
Stimulus
The ‘stimuli’ folder contains all images of the semantic categories of animals and tools presented to participants.
Example code
We have prepared example scripts demonstrating how to load the EEG and fNIRS data into Python using the MNE and MNE-BIDS packages. These scripts are located in the ‘code’ directory.
References
This dataset was analyzed in the following publications:
[1] Rybář, M., Poli, R. and Daly, I., 2024. Using data from cue presentations results in grossly overestimating semantic BCI performance. Scientific Reports, 14(1), p.28003.
[2] Rybář, M., Poli, R. and Daly, I., 2021. Decoding of semantic categories of imagined concepts of animals and tools in fNIRS. Journal of Neural Engineering, 18(4), p.046035.
[3] Rybář, M., 2023. Towards EEG/fNIRS-based semantic brain-computer interfacing (Doctoral dissertation, University of Essex).
Dataset Information#
Dataset ID: ds004514
Title: Simultaneous EEG and fNIRS recordings for semantic decoding of imagined animals and tools
Year: 2023
Authors: Milan Rybář, Riccardo Poli, Ian Daly
License: CC0
Citation / DOI: doi:10.18112/openneuro.ds004514.v1.1.2
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 12
Recordings: 161
Tasks: 2
Channels: 22 (12 recordings), 28 (12), 64 (12), 80 (12)
Sampling rate (Hz): 2048.0 (24 recordings), 7.8125 (12), ≈8.93 (12)
Duration (hours): 0.0
Pathology: Healthy
Modality: Multisensory
Type: Other
Size on disk: 24.1 GB
File count: 161
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds004514.v1.1.2
API Reference#
Use the DS004514 class to access this dataset programmatically.
- class eegdash.dataset.DS004514(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)#
Bases: EEGDashDataset
OpenNeuro dataset ds004514. Modality: eeg, fnirs; Experiment type: Other; Subject type: Healthy. Subjects: 12; recordings: 24; tasks: 2.
- Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
Local dataset cache directory (cache_dir / dataset_id).
- Type: Path
- query#
Merged query with the dataset filter applied.
- Type: dict
- records#
Metadata records used to build the dataset, if pre-fetched.
- Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/ds004514
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds004514
Examples
>>> from eegdash.dataset import DS004514
>>> dataset = DS004514(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset