NM000345: EEG dataset, 12 subjects#

CastillosBurstVEP40

Access recordings and metadata through EEGDash.

Citation: Kalou Cabrera Castillos, Simon Ladouce, Ludovic Darmet, Frédéric Dehais (2023). CastillosBurstVEP40. 10.1016/j.neuroimage.2023.120446

Modality: EEG | Subjects: 12 | Recordings: 12 | License: CC-BY-4.0 | Source: NeMAR

Metadata: Complete (100%)

Quickstart#

Install

pip install eegdash

Access the data

from eegdash.dataset import NM000345

dataset = NM000345(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)

Filter by subject

dataset = NM000345(cache_dir="./data", subject="01")

Advanced query

dataset = NM000345(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
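The $in operator selects records whose field value appears in the given list. A minimal plain-Python sketch of that matching semantics (matches is a hypothetical helper for illustration; the real filtering happens inside EEGDash):

```python
# Illustrative sketch of MongoDB-style "$in" matching, as used in the
# query argument above. This is NOT the library's implementation.
def matches(record: dict, query: dict) -> bool:
    for field, condition in query.items():
        value = record.get(field)
        if isinstance(condition, dict) and "$in" in condition:
            if value not in condition["$in"]:
                return False
        elif value != condition:
            return False
    return True

records = [{"subject": "01"}, {"subject": "02"}, {"subject": "07"}]
query = {"subject": {"$in": ["01", "02"]}}
kept = [r for r in records if matches(r, query)]
print([r["subject"] for r in kept])  # ['01', '02']
```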

Iterate recordings

for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])

If you use this dataset in your research, please cite the original authors.

BibTeX

@dataset{nm000345,
  title = {CastillosBurstVEP40},
  author = {Kalou Cabrera Castillos and Simon Ladouce and Ludovic Darmet and Frédéric Dehais},
  doi = {10.1016/j.neuroimage.2023.120446},
  url = {https://doi.org/10.1016/j.neuroimage.2023.120446},
}

About This Dataset#

CastillosBurstVEP40

c-VEP and Burst-VEP dataset from Castillos et al. (2023)

Dataset Overview

• Code: CastillosBurstVEP40
• Paradigm: cvep
• DOI: https://doi.org/10.1016/j.neuroimage.2023.120446
• Subjects: 12
• Sessions per subject: 1
• Events: 0=100, 1=101
• Trial interval: (0, 0.25) s
• File format: EEGLAB .set
• Number of contributing labs: 1

Acquisition

• Sampling rate: 500.0 Hz
• Number of channels: 32
• Channel types: eeg=32
• Channel names: C3, C4, CP1, CP2, CP5, CP6, Cz, F10, F3, F4, F7, F8, F9, FC1, FC2, FC5, FC6, Fp1, Fp2, Fz, O1, O2, Oz, P10, P3, P4, P7, P8, P9, Pz, T7, T8
• Montage: standard_1020
• Hardware: BrainProducts LiveAmp 32
• Reference: FCz
• Ground: FPz
• Sensor type: eeg
• Line frequency: 50.0 Hz
• Online filters: line noise, IIR cut-band filter between 49.9 and 50.1 Hz of order 16
• Impedance threshold: 25.0 kOhm
• Cap manufacturer: BrainProducts
• Cap model: Acticap
• Electrode type: active

Participants

• Number of subjects: 12
• Health status: healthy
• Age: mean=30.6, std=7.1
• Gender distribution: female=4, male=8
• Species: human

Experimental Protocol

• Paradigm: cvep
• Task type: reactive BCI
• Number of classes: 2
• Class labels: 0, 1
• Trial duration: 2.2 s
• Tasks: attend to cued target
• Study design: factorial design
• Study domain: brain-computer interface
• Feedback type: none
• Stimulus type: aperiodic visual flashes
• Stimulus modalities: visual
• Primary modality: visual
• Synchronicity: synchronous
• Mode: offline
• Training/test split: False
• Instructions: Participants were instructed to focus on c-VEP targets cued sequentially
• Stimulus presentation: screen = Dell P2419HC, 1920 × 1080 pixels, 265 cd/m2, 60 Hz
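With a 500 Hz sampling rate and 2.2 s trials, each epoch spans 1100 samples. A minimal numpy sketch of fixed-length epoching under those parameters (the event onsets and data below are simulated, not taken from this dataset):

```python
import numpy as np

# Epoch length from the acquisition parameters: 500 Hz * 2.2 s.
sfreq = 500.0
trial_dur = 2.2
n_samples = int(round(sfreq * trial_dur))  # 1100 samples per trial

rng = np.random.default_rng(0)
continuous = rng.standard_normal((32, 30_000))   # 32 channels, 60 s of fake data
event_onsets = [1_000, 5_000, 9_000]             # hypothetical event sample indices

# Slice one fixed-length window per event: (n_trials, n_channels, n_samples).
epochs = np.stack([continuous[:, s:s + n_samples] for s in event_onsets])
print(epochs.shape)  # (3, 32, 1100)
```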

HED Event Annotations

Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser

Event 0
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Label/intensity_0

Event 1
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Label/intensity_1

Paradigm-Specific Parameters

• Detected paradigm: cvep
• Stimulus frequencies: [2.0, 3.0, 4.0] Hz
• Code type: burst
• Number of targets: 4
• Cue duration: 0.5 s
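The burst codes used here are aperiodic sequences of short flashes separated by a minimum gap (50 ms flashes, at least 200 ms between bursts, per the Methodology section). The following numpy sketch generates such a sequence at a 60 Hz frame rate; it illustrates the idea only and is not the authors' actual code-construction procedure:

```python
import numpy as np

# Illustrative burst-style stimulus code: 50 ms flashes (3 frames at
# 60 Hz) separated by at least 200 ms (12 frames), with random jitter
# to make the sequence aperiodic. NOT the authors' actual codes.
def make_burst_code(n_frames=132, flash_frames=3, min_gap_frames=12, seed=0):
    rng = np.random.default_rng(seed)
    code = np.zeros(n_frames, dtype=int)
    pos = 0
    while pos + flash_frames <= n_frames:
        code[pos:pos + flash_frames] = 1          # one 50 ms flash
        # jittered gap keeps the rate around 2-4 bursts per second
        pos += flash_frames + min_gap_frames + int(rng.integers(0, 10))
    return code

code = make_burst_code()  # 132 frames = one 2.2 s trial at 60 Hz
print("on-frames:", int(code.sum()))
```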

Data Structure

• Trials: 60
• Blocks per session: 15
• Context: 15 blocks × 4 trials per block = 60 trials per subject for burst c-VEP at 40% amplitude

Preprocessing

Data state: raw

Signal Processing

• Classifiers: CNN (convolutional neural network)
• Feature extraction: EEG2Code bitwise decoding
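Bitwise decoding yields one bit estimate per stimulus frame; the target is then selected by comparing the decoded sequence against each candidate code. A minimal numpy sketch of that correlation-based selection step (the CNN itself is omitted, and all data below are simulated):

```python
import numpy as np

# Sketch of the target-selection step that typically follows bitwise
# decoding in c-VEP BCIs: correlate the predicted bit sequence with
# each target's code and pick the best match.
rng = np.random.default_rng(42)
codes = rng.integers(0, 2, size=(4, 132))        # 4 targets, 132 frames

true_target = 2
predicted = codes[true_target].astype(float)
predicted += rng.normal(0, 0.4, size=predicted.shape)  # noisy bit estimates

scores = [np.corrcoef(predicted, c)[0, 1] for c in codes]
print("selected target:", int(np.argmax(scores)))
```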

Cross-Validation

Evaluation type: offline

Performance (Original Study)

• Burst, 100% amplitude: 90.5% accuracy (17.6 s calibration) to 95.6% (52.8 s calibration)
• m-sequence, 100% amplitude: 71.4% accuracy (17.6 s calibration) to 85.0% (52.8 s calibration)
• Burst, 40% amplitude: 94.2% accuracy
• Mean selection time: 1.5 s

BCI Application

Applications: brain-computer interface Environment: laboratory Online feedback: False

Tags

Pathology: Healthy Modality: EEG Type: reactive BCI, c-VEP

Documentation

• Description: Burst c-VEP based BCI study optimizing stimulus design for enhanced classification with minimal calibration data and improved user experience. The study introduces an innovative variant of code-VEP called 'Burst c-VEP' involving short bursts of aperiodic visual flashes at 2-4 flashes per second.
• DOI: 10.1016/j.neuroimage.2023.120446
• Associated paper DOI: 10.1016/j.neuroimage.2023.120446
• License: CC-BY-4.0
• Investigators: Kalou Cabrera Castillos, Simon Ladouce, Ludovic Darmet, Frédéric Dehais
• Senior author: Frédéric Dehais
• Contact: kalou.cabrera-castillos@isae-supaero.fr
• Institution: Institut Supérieur de l'Aéronautique et de l'Espace (ISAE-SUPAERO)
• Department: Human Factors and Neuroergonomics
• Address: 10 Av. Edouard Belin, Toulouse, 31400, France
• Country: FR
• Repository: Zenodo
• Data URL: https://zenodo.org/record/8255618
• Publication year: 2023
• Ethics approval: University of Toulouse ethics committee (CER approval number 2020-334); Declaration of Helsinki
• Keywords: Code-VEP, Reactive BCI, CNN, Amplitude depth reduction, Visual comfort

Abstract

The utilization of aperiodic flickering visual stimuli under the form of code-modulated Visual Evoked Potentials (c-VEP) represents a pivotal advancement in the field of reactive Brain–Computer Interface (rBCI). A major advantage of the c-VEP approach is that the training of the model is independent of the number and complexity of targets, which helps reduce calibration time. Nevertheless, the existing designs of c-VEP stimuli can be further improved in terms of visual user experience but also to achieve a higher signal-to-noise ratio, while shortening the selection time and calibration process. In this study, we introduce an innovative variant of code-VEP, referred to as ‘Burst c-VEP’. This original approach involves the presentation of short bursts of aperiodic visual flashes at a deliberately slow rate, typically ranging from two to four flashes per second. The rationale behind this design is to leverage the sensitivity of the primary visual cortex to transient changes in low-level stimuli features to reliably elicit distinctive series of visual evoked potentials. In comparison to other types of faster-paced code sequences, burst c-VEP exhibit favorable properties to achieve high bitwise decoding performance using convolutional neural networks (CNN), which yields potential to attain faster selection time with the need for less calibration data. Furthermore, our investigation focuses on reducing the perceptual saliency of c-VEP through the attenuation of visual stimuli contrast and intensity to significantly improve users’ visual comfort. The proposed solutions were tested through an offline 4-classes c-VEP protocol involving 12 participants. Following a factorial design, participants were instructed to focus on c-VEP targets whose pattern (burst and maximum-length sequences) and amplitude (100% or 40% amplitude depth modulations) were manipulated across experimental conditions. 
Firstly, the full amplitude burst c-VEP sequences exhibited higher accuracy, ranging from 90.5% (with 17.6 s of calibration data) to 95.6% (with 52.8 s of calibration data), compared to its m-sequence counterpart (71.4% to 85.0%). The mean selection time for both types of codes (1.5 s) compared favorably to reports from previous studies. Secondly, our findings revealed that lowering the intensity of the stimuli only slightly decreased the accuracy of the burst code sequences to 94.2% while leading to substantial improvements in terms of user experience. Taken together, these results demonstrate the high potential of the proposed burst codes to advance reactive BCI both in terms of performance and usability. The collected dataset, along with the proposed CNN architecture implementation, are shared through open-access repositories.

Methodology

Factorial experimental design with 12 participants. Four conditions: burst or m-sequence codes × 100% or 40% amplitude depth. Participants attended to cued targets presented as aperiodic visual flashes. Burst codes: 50 ms flashes at 2-4 Hz with a 200 ms minimum inter-burst interval. M-sequences: pseudo-random binary sequences at ~10 Hz. EEG recorded at 500 Hz using a 32-channel BrainProducts LiveAmp. Analysis performed on occipital/parietal electrodes. CNN-based bitwise decoding (improved EEG2Code architecture). Each participant completed 15 blocks of 4 trials per condition (60 trials per class, 240 total trials). Trial structure: 700 ms ITI, 500 ms cue, 2200 ms stimulation. Display: Dell P2419HC 60 Hz LCD. Luminance: medium grey background (124 lux), 100% condition (168 lux), 40% condition (142 lux). Preprocessing: average re-reference, 50 Hz notch filter (IIR, order 16), epoching 0-2.2 s, baseline removal. Subjective assessments of visual comfort, tiredness, and intrusiveness were collected.
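The preprocessing steps above can be sketched with plain numpy. This is a minimal sketch only: the notch filter is omitted, the 100 ms baseline window is an assumption (the paper does not specify it here), and the data are simulated:

```python
import numpy as np

# Minimal numpy sketch of average re-referencing and baseline removal
# on one epoch (32 channels x 1100 samples = 2.2 s at 500 Hz).
rng = np.random.default_rng(1)
epoch = rng.standard_normal((32, 1100))

# Common average reference: subtract the across-channel mean per sample.
rereferenced = epoch - epoch.mean(axis=0, keepdims=True)

# Baseline removal: subtract each channel's mean over an assumed
# 100 ms (50-sample) baseline window.
baseline = rereferenced[:, :50].mean(axis=1, keepdims=True)
cleaned = rereferenced - baseline

# After re-referencing, the across-channel mean is numerically zero.
print(abs(rereferenced.mean(axis=0)).max() < 1e-12)
```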

References

• Kalou Cabrera Castillos (2023). 4-class code-VEP EEG data [Data set]. Zenodo. DOI: https://doi.org/10.5281/zenodo.8255618
• Kalou Cabrera Castillos, Simon Ladouce, Ludovic Darmet, Frédéric Dehais (2023). Burst c-VEP based BCI: Optimizing stimulus design for enhanced classification with minimal calibration data and improved user experience. NeuroImage, 284, 120446. ISSN 1053-8119. DOI: https://doi.org/10.1016/j.neuroimage.2023.120446
• Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(1896). https://doi.org/10.21105/joss.01896
• Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8

Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb

Dataset Information#

Dataset ID

NM000345

Title

CastillosBurstVEP40

Author (year)

Castillos2023_CastillosBurstVEP40

Canonical

Importable as

NM000345, Castillos2023_CastillosBurstVEP40

Year

2023

Authors

Kalou Cabrera Castillos, Simon Ladouce, Ludovic Darmet, Frédéric Dehais

License

CC-BY-4.0

Citation / DOI

doi:10.1016/j.neuroimage.2023.120446

Source links

OpenNeuro | NeMAR | Source URL

Copy-paste BibTeX
@dataset{nm000345,
  title = {CastillosBurstVEP40},
  author = {Kalou Cabrera Castillos and Simon Ladouce and Ludovic Darmet and Frédéric Dehais},
  doi = {10.1016/j.neuroimage.2023.120446},
  url = {https://doi.org/10.1016/j.neuroimage.2023.120446},
}

Found an issue with this dataset?

If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!

Report an Issue on GitHub

Technical Details#

Subjects & recordings
  • Subjects: 12

  • Recordings: 12

  • Tasks: 1

Channels & sampling rate
  • Channels: 32

  • Sampling rate (Hz): 500.0

  • Duration (hours): 0.84

Tags
  • Pathology: Healthy

  • Modality: Visual

  • Type: Attention

Files & format
  • Size on disk: 144.2 MB

  • File count: 12

  • Format: BIDS

License & citation
  • License: CC-BY-4.0

  • DOI: doi:10.1016/j.neuroimage.2023.120446

API Reference#

Use the NM000345 class to access this dataset programmatically.

class eegdash.dataset.NM000345(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

CastillosBurstVEP40

Study:

nm000345 (NeMAR)

Author (year):

Castillos2023_CastillosBurstVEP40

Canonical:

Also importable as: NM000345, Castillos2023_CastillosBurstVEP40.

Modality: eeg; Experiment type: Attention; Subject type: Healthy. Subjects: 12; recordings: 12; tasks: 1.

Parameters:
  • cache_dir (str | Path) – Directory where data are cached locally.

  • query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.

  • s3_bucket (str | None) – Base S3 bucket used to locate the data.

  • **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

data_dir#

Local dataset cache directory (cache_dir / dataset_id).

Type:

Path

query#

Merged query with the dataset filter applied.

Type:

dict

records#

Metadata records used to build the dataset, if pre-fetched.

Type:

list[dict] | None

Notes

Each item is a recording; recording-level metadata are available via dataset.description. The query argument supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References

OpenNeuro dataset: https://openneuro.org/datasets/nm000345 NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000345 DOI: https://doi.org/10.1016/j.neuroimage.2023.120446

Examples

>>> from eegdash.dataset import NM000345
>>> dataset = NM000345(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
__init__(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
save(path, overwrite=False)[source]#

Save the dataset to disk.

Parameters:
  • path (str or Path) – Destination file path.

  • overwrite (bool, default False) – If True, overwrite existing file.

Return type:

None

See Also#