NM000219: EEG dataset, 18 subjects#
BNCI 2020-002 Attention Shift (Covert Spatial Attention) dataset
Access recordings and metadata through EEGDash.
Citation: Christoph Reichert, Igor Fabian Tellez Ceja, Catherine M. Sweeney-Reed, Hans-Jochen Heinze, Hermann Hinrichs, Stefan Dürschmid (2020). BNCI 2020-002 Attention Shift (Covert Spatial Attention) dataset.
Modality: eeg Subjects: 18 Recordings: 18 License: CC-BY-4.0 Source: nemar
Metadata: 90% complete
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import NM000219
dataset = NM000219(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = NM000219(cache_dir="./data", subject="01")
Advanced query
dataset = NM000219(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{nm000219,
  title = {BNCI 2020-002 Attention Shift (Covert Spatial Attention) dataset},
  author = {Christoph Reichert and Igor Fabian Tellez Ceja and Catherine M. Sweeney-Reed and Hans-Jochen Heinze and Hermann Hinrichs and Stefan Dürschmid},
  year = {2020},
}
About This Dataset#
BNCI 2020-002 Attention Shift (Covert Spatial Attention) dataset
Dataset Overview
Code: BNCI2020-002
Paradigm: p300
DOI: 10.3389/fnins.2020.591777
Subjects: 18
Sessions per subject: 1
Events: NonTarget=1, Target=2
Trial interval: [0, 16] s
File format: MAT
Acquisition
Sampling rate: 250.0 Hz
Number of EEG channels: 30 (plus 2 EOG)
Channel types: eeg=30, eog=2
Channel names: C3, C4, CP1, CP2, Cz, F3, F4, F7, F8, FC1, FC2, Fp1, Fp2, Fz, HEOG, IZ, LMAST, O10, O9, Oz, P3, P4, P7, P8, PO3, PO4, PO7, PO8, Pz, T7, T8, VEOG
Montage: extended 10-20
Hardware: BrainAmp DC Amplifier
Reference: right mastoid
Sensor type: Ag/AgCl electrodes
Line frequency: 50.0 Hz
Online filters: 0.1 Hz highpass
Cap manufacturer: Brain Products GmbH
Auxiliary channels: EOG (2 ch, horizontal, vertical)
Participants
Number of subjects: 18
Health status: healthy
Age: mean=27.0, min=19.0, max=38.0
Gender distribution: male=8, female=10
Species: human
Experimental Protocol
Paradigm: p300
Task type: binary decision
Number of classes: 2
Class labels: NonTarget, Target
Feedback type: visual (yes/no text)
Stimulus type: colored crosses (green + and red x)
Stimulus modalities: visual
Primary modality: visual
Synchronicity: synchronous
Mode: online
Training/test split: True
Instructions: Respond to yes/no questions by shifting attention to green cross (yes) or red cross (no) while maintaining central gaze fixation
Stimulus presentation: duration_ms=250, soa_ms=850 (jittered by 0-250 ms), stimuli_per_trial=10
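The stimulus timing above (250 ms stimuli, 850 ms SOA jittered by 0-250 ms, 10 stimuli per trial) can be sketched as a simple onset generator. This is an illustrative interpretation, not code from the dataset; the function name and the uniform-jitter assumption are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def trial_onsets(n_stimuli=10, soa_s=0.850, jitter_s=0.250):
    """Stimulus onset times (s) for one trial, assuming uniform jitter."""
    gaps = soa_s + rng.uniform(0.0, jitter_s, size=n_stimuli - 1)
    return np.concatenate(([0.0], np.cumsum(gaps)))

onsets = trial_onsets()
print(onsets)  # 10 onsets; consecutive gaps fall in [0.85, 1.10] s
```

Each 250 ms stimulus thus ends well before the next onset, leaving a blank inter-stimulus interval of at least 600 ms.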
HED Event Annotations
Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser
NonTarget
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Non-target
Target
├─ Sensory-event
├─ Experimental-stimulus
├─ Visual-presentation
└─ Target
Paradigm-Specific Parameters
Detected paradigm: p300
Number of targets: 2
Number of repetitions: 10
Stimulus onset asynchrony: 850.0 ms
Data Structure
Trials per block: 24
Blocks per session: 7
Preprocessing
Data state: raw
Preprocessing applied: False (the distributed data are raw; the steps below describe the original study's pipeline)
Steps: re-referenced to average of left and right mastoid, 4th-order zero-phase IIR Butterworth bandpass filter (1.0-12.5 Hz), resampled to 50 Hz, epoched from stimulus onset to 750 ms after
Highpass filter: 1.0 Hz
Lowpass filter: 12.5 Hz
Bandpass filter: [1.0, 12.5]
Filter type: Butterworth IIR
Filter order: 4
Re-reference: average of left and right mastoid
Downsampled to: 50.0 Hz
Epoch window: [0.0, 0.75]
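The filter/resample/epoch steps above can be sketched with SciPy on a synthetic (channels x samples) array. This is a rough sketch under stated assumptions, not the study's toolchain: mastoid re-referencing is omitted (the synthetic array has no labeled mastoid channels), and "4th order zero-phase" is interpreted here as a 4th-order design applied forward-backward.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, resample_poly

fs = 250.0      # original sampling rate (Hz)
fs_new = 50.0   # target rate after resampling

def preprocess(data, fs=fs):
    # 4th-order Butterworth bandpass 1.0-12.5 Hz, zero-phase (filtfilt)
    sos = butter(4, [1.0, 12.5], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, data, axis=-1)
    # Resample 250 Hz -> 50 Hz (polyphase, factor 1/5)
    return resample_poly(filtered, up=1, down=5, axis=-1)

def epoch(data, onsets_s, tmax_s=0.75, fs=fs_new):
    # Cut epochs from stimulus onset to 750 ms after onset
    n = int(tmax_s * fs)
    return np.stack([data[:, int(t * fs):int(t * fs) + n] for t in onsets_s])

raw = np.random.default_rng(0).standard_normal((30, int(250 * 10)))  # 10 s
proc = preprocess(raw)
epochs = epoch(proc, onsets_s=[1.0, 3.0, 5.0])
print(epochs.shape)  # (3, 30, 37): 3 epochs, 30 channels, 0.75 s at 50 Hz
```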
Signal Processing
Classifiers: Canonical Correlation Analysis (CCA)
Feature extraction: N2pc, ERP, Canonical difference waves
Spatial filters: CCA spatial filters
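CCA finds linear projections of two data blocks that are maximally correlated; the study applies it to EEG epochs and canonical difference waves. A minimal sketch of the core computation (first canonical correlation via whitening plus SVD) follows; this is a generic textbook implementation, not the study's code.

```python
import numpy as np

def cca_first_corr(X, Y, reg=1e-8):
    """First canonical correlation between X (n x p) and Y (n x q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = len(X)
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])  # regularized covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whiten both blocks; singular values of the whitened cross-covariance
    # are the canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    return np.linalg.svd(Wx @ Cxy @ Wy.T, compute_uv=False)[0]

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
Y = X @ rng.standard_normal((5, 3))           # Y linearly derived from X
print(cca_first_corr(X, Y))                   # close to 1 for related blocks
```

The left singular vectors of the whitened cross-covariance, mapped back through `Wx`, give the CCA spatial filters mentioned above.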
Cross-Validation
Method: leave-one-out cross-validation (LOOCV)
Evaluation type: within_subject
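Leave-one-out cross-validation trains on all trials but one and tests on the held-out trial, repeated over all trials. The sketch below uses a nearest-class-mean classifier purely for illustration; the original study scored trials with CCA instead.

```python
import numpy as np

def loocv_accuracy(X, y):
    """Within-subject LOOCV with a nearest-class-mean classifier."""
    correct = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i          # hold out trial i
        Xtr, ytr = X[mask], y[mask]
        means = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
        pred = min(means, key=lambda c: np.linalg.norm(X[i] - means[c]))
        correct += pred == y[i]
    return correct / len(X)

# Two well-separated synthetic classes (hypothetical features, not EEG)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(3, 1, (20, 4))])
y = np.repeat([0, 1], 20)
print(loocv_accuracy(X, y))
```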
Performance (Original Study)
Accuracy: 88.5%
ITR: 3.02 bits/min
Accuracy SD: 7.8%
Min accuracy: 70.8%
Max accuracy: 90.3%
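Information transfer rate (ITR) for a binary BCI is commonly reported with the Wolpaw formula; whether the study used exactly this formula and which effective selection time it assumed are assumptions here, so the numbers below are illustrative rather than a reproduction of the reported 3.02 bits/min.

```python
import math

def wolpaw_bits_per_selection(n_classes, accuracy):
    """Wolpaw ITR in bits per selection."""
    p, n = accuracy, n_classes
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits

bits = wolpaw_bits_per_selection(2, 0.885)    # mean accuracy from above
# Example conversion to bits/min assuming one selection per 16 s trial
print(bits, bits * 60 / 16)
```

The bits/min figure scales directly with how many selections fit into a minute, so shorter effective selection times yield higher ITR for the same accuracy.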
BCI Application
Applications: communication, binary decision
Environment: laboratory
Online feedback: True
Tags
Pathology: Healthy
Modality: Visual
Type: Attention
Documentation
Description: Gaze-independent brain-computer interface based on covert spatial attention shifts for binary (yes/no) communication
DOI: 10.3389/fnins.2020.591777
Associated paper DOI: 10.3389/fnins.2020.591777
License: CC-BY-4.0
Investigators: Christoph Reichert, Igor Fabian Tellez Ceja, Catherine M. Sweeney-Reed, Hans-Jochen Heinze, Hermann Hinrichs, Stefan Dürschmid
Senior author: Stefan Dürschmid
Contact: christoph.reichert@lin-magdeburg.de
Institution: Leibniz Institute for Neurobiology
Department: Department of Behavioral Neurology
Address: Magdeburg, Germany
Country: Germany
Repository: BNCI Horizon
Publication year: 2020
Funding: German Ministry of Education and Research (BMBF) within the Research Campus STIMULATE under grant number 13GW0095D
Ethics approval: Ethics Committee of the Otto-von-Guericke University, Magdeburg
Keywords: visual spatial attention, brain-computer interface, stimulus features, N2pc, canonical correlation analysis, gaze-independent, BCI
References
Reichert, C., Tellez-Ceja, I. F., Schwenker, F., Rusnac, A.-L., Curio, G., Aust, L., & Hinrichs, H. (2020). Impact of Stimulus Features on the Performance of a Gaze-Independent Brain-Computer Interface Based on Covert Spatial Attention Shifts. Frontiers in Neuroscience, 14, 591777. https://doi.org/10.3389/fnins.2020.591777
Notes
.. versionadded:: 1.3.0
This dataset uses a covert spatial attention paradigm with N2pc ERP detection, which differs from traditional P300 or motor imagery paradigms. The paradigm is designed for gaze-independent BCI control, making it suitable for users who cannot control eye movements.
See Also
BNCI2015_009 : AMUSE auditory spatial P300 paradigm
BNCI2015_010 : RSVP visual P300 paradigm
Examples
>>> from moabb.datasets import BNCI2020_002
>>> dataset = BNCI2020_002()
>>> dataset.subject_list
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18]
Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., & Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., & Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8
Generated by MOABB 1.5.0 (Mother of All BCI Benchmarks): https://github.com/NeuroTechX/moabb
Dataset Information#
Dataset ID: NM000219
Title: BNCI 2020-002 Attention Shift (Covert Spatial Attention) dataset
Author (year): Reichert2020
Canonical: BNCI2020, BNCI2020_002_AttentionShift, BNCI2020_002_CovertSpatialAttention
Importable as: NM000219, Reichert2020, BNCI2020, BNCI2020_002_AttentionShift, BNCI2020_002_CovertSpatialAttention
Year: 2020
Authors: Christoph Reichert, Igor Fabian Tellez Ceja, Catherine M. Sweeney-Reed, Hans-Jochen Heinze, Hermann Hinrichs, Stefan Dürschmid
License: CC-BY-4.0
Citation / DOI: Unknown
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 18
Recordings: 18
Tasks: 1
Channels: 30
Sampling rate (Hz): 250.0
Duration (hours): 13.23
Pathology: Healthy
Modality: Visual
Type: Attention
Size on disk: 1023.6 MB
File count: 18
Format: BIDS
License: CC-BY-4.0
DOI: —
API Reference#
Use the NM000219 class to access this dataset programmatically.
class eegdash.dataset.NM000219(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset
BNCI 2020-002 Attention Shift (Covert Spatial Attention) dataset
Study: nm000219 (NeMAR)
Author (year): Reichert2020
Canonical: BNCI2020, BNCI2020_002_AttentionShift, BNCI2020_002_CovertSpatialAttention
Also importable as: NM000219, Reichert2020, BNCI2020, BNCI2020_002_AttentionShift, BNCI2020_002_CovertSpatialAttention
Modality: eeg; Experiment type: Attention; Subject type: Healthy. Subjects: 18; recordings: 18; tasks: 1.
Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
data_dir#
Local dataset cache directory (cache_dir / dataset_id).
Type: Path
query#
Merged query with the dataset filter applied.
Type: dict
records#
Metadata records used to build the dataset, if pre-fetched.
Type: list[dict] | None
Notes
Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.
References
OpenNeuro dataset: https://openneuro.org/datasets/nm000219
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000219
Examples
>>> from eegdash.dataset import NM000219
>>> dataset = NM000219(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset