eegdash.dataset.DS004830#

Spatial Attention Decoding using fNIRS During Complex Scene Analysis (OpenNeuro ds004830). Access recordings and metadata through EEGDash.

Modality: fnirs | Tasks: 5 | License: CC0 | Subjects: 13 | Recordings: 226 | Source: OpenNeuro

Dataset Information#

Dataset ID

DS004830

Title

Spatial Attention Decoding using fNIRS During Complex Scene Analysis

Year

Unknown

Authors

Matthew Ning, Sudan Duwadi, Meryem A. Yucel, Alexander Von Luhmann, David A. Boas, Kamal Sen

License

CC0

Citation / DOI

doi:10.18112/openneuro.ds004830.v1.0.1

Source links

OpenNeuro | NeMAR | Source URL

Copy-paste BibTeX
@dataset{ds004830,
  title = {Spatial Attention Decoding using fNIRS During Complex Scene Analysis},
  author = {Matthew Ning and Sudan Duwadi and Meryem A. Yucel and Alexander Von Luhmann and David A. Boas and Kamal Sen},
  doi = {10.18112/openneuro.ds004830.v1.0.1},
  url = {https://doi.org/10.18112/openneuro.ds004830.v1.0.1},
}

Highlights#

Subjects & recordings
  • Subjects: 13

  • Recordings: 226

  • Tasks: 5

Channels & sampling rate
  • Channels: Unknown

  • Sampling rate (Hz): Unknown

  • Duration (hours): Unknown

Tasks & conditions
  • Tasks: 5

  • Experiment type: Unknown

  • Subject type: Unknown

Files & format
  • Size on disk: Unknown

  • File count: Unknown

  • Format: Unknown

License & citation
  • License: CC0

  • DOI: 10.18112/openneuro.ds004830.v1.0.1

Provenance
  • Source: OpenNeuro

Quickstart#

Install

pip install eegdash

Load a recording

from eegdash.dataset import DS004830

dataset = DS004830(cache_dir="./data")
recording = dataset[0]
raw = recording.load()
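Inspect the loaded recording

A minimal sanity check, assuming recording.load() returns an MNE Raw-like object (an assumption of this sketch; the summary metadata does not specify the return type):

print(len(dataset))       # number of recordings (assumes len() is supported)
print(raw.ch_names[:5])   # first few channel names
print(raw.info["sfreq"])  # sampling rate in Hz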

Filter/query

dataset = DS004830(cache_dir="./data", subject="01")
dataset = DS004830(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
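A filtered dataset still indexes into individual recordings, so all selected recordings can be loaded in a loop. A minimal sketch, assuming len() and integer indexing behave as in the snippets above:

# Load every recording matching the subject filter.
dataset = DS004830(cache_dir="./data", subject="01")
for i in range(len(dataset)):
    raw = dataset[i].load()
    print(i, raw)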

Quality & caveats#

  • No dataset-specific caveats are listed in the available metadata.

API#

class eegdash.dataset.DS004830(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

Bases: EEGDashDataset

OpenNeuro dataset ds004830: Spatial Attention Decoding using fNIRS During Complex Scene Analysis. Modality: fnirs; experiment type: Unknown. Subjects: 13; recordings: 226; tasks: 5. Keywords: Spatial Attention Decoding, Auditory Neuroscience, Complex Scene Analysis, fNIRS, BCI, Machine Learning.

Parameters:
  • cache_dir (str | Path) – Directory where data are cached locally.

  • query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.

  • s3_bucket (str | None) – Base S3 bucket used to locate the data.

  • **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
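A sketch of how these parameters combine. The bucket name below is purely hypothetical, and query must not repeat the dataset key, which the class adds itself:

from eegdash.dataset import DS004830

dataset = DS004830(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},  # ANDed with the dataset filter
    s3_bucket="s3://example-bucket",           # hypothetical bucket; omit to use the default
)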

data_dir#

Local dataset cache directory (cache_dir / dataset_id).

Type:

Path

query#

Merged query with the dataset filter applied.

Type:

dict

records#

Metadata records used to build the dataset, if pre-fetched.

Type:

list[dict] | None

Notes

Each item in the dataset is a single recording; recording-level metadata are available via dataset.description. The query argument accepts MongoDB-style filters on the fields in ALLOWED_QUERY_FIELDS and is combined (ANDed) with the dataset filter. No dataset-specific caveats are provided in the summary metadata.
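A short sketch of working with the merged metadata, assuming dataset.description behaves like a per-recording table and that subject is among the fields in ALLOWED_QUERY_FIELDS:

from eegdash.dataset import DS004830

dataset = DS004830(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},  # MongoDB-style filter
)

# Recording-level metadata for the selected recordings.
print(dataset.description)

# The merged query keeps the dataset filter alongside the user filter.
print(dataset.query)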

References

OpenNeuro dataset: https://openneuro.org/datasets/ds004830
NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds004830
DOI: https://doi.org/10.18112/openneuro.ds004830.v1.0.1

Examples

>>> from eegdash.dataset import DS004830
>>> dataset = DS004830(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()

__init__(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

save(path, overwrite=False)[source]#

Save the dataset to disk.

Parameters:
  • path (str or Path) – Destination file path.

  • overwrite (bool, default False) – If True, overwrite existing file.

Return type:

None
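A usage sketch based on the documented signature; the output path below is arbitrary, and the on-disk format written by save() is not specified in this summary:

from pathlib import Path
from eegdash.dataset import DS004830

dataset = DS004830(cache_dir="./data")

# Persist the dataset, overwriting any existing file at the destination.
dataset.save(Path("./ds004830_dataset"), overwrite=True)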

See Also#