NM000175: fNIRS dataset, 5 subjects#
fNIRS Finger Tapping
Access recordings and metadata through EEGDash.
Citation: Robert Luke, Eric Larson, Alexandre Gramfort, Macquarie University (—). fNIRS Finger Tapping.
Modality: fnirs Subjects: 5 Recordings: 5 License: CC0 Source: nemar
Metadata: Complete (90%)
Quickstart#
Install

```shell
pip install eegdash
```

Access the data

```python
from eegdash.dataset import NM000175

dataset = NM000175(cache_dir="./data")

# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
```

Filter by subject

```python
dataset = NM000175(cache_dir="./data", subject="01")
```

Advanced query

```python
dataset = NM000175(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
```
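To build intuition for what a `$in` filter selects, here is a minimal pure-Python sketch. The `records` list and `matches` helper are hypothetical stand-ins for illustration only, not part of the EEGDash API:

```python
# Toy metadata records standing in for the dataset's recording index
records = [
    {"subject": "01", "task": "tapping"},
    {"subject": "02", "task": "tapping"},
    {"subject": "03", "task": "tapping"},
]

def matches(record, query):
    """Evaluate a tiny subset of MongoDB-style filters:
    plain equality and the $in membership operator."""
    for field, cond in query.items():
        value = record.get(field)
        if isinstance(cond, dict) and "$in" in cond:
            if value not in cond["$in"]:
                return False
        elif value != cond:
            return False
    return True

# Same filter as the advanced query above, applied to the toy records
selected = [r for r in records if matches(r, {"subject": {"$in": ["01", "02"]}})]
print([r["subject"] for r in selected])  # ['01', '02']
```

The real query is evaluated server-side against the metadata database; this sketch only mirrors the selection semantics.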
Iterate recordings

```python
for rec in dataset:
    print(rec.subject, rec.raw.info["sfreq"])
```
If you use this dataset in your research, please cite the original authors.
BibTeX

```bibtex
@dataset{nm000175,
  title  = {fNIRS Finger Tapping},
  author = {Robert Luke and Eric Larson and Alexandre Gramfort and Macquarie University},
}
```
About This Dataset#
BIDS fNIRS Example Dataset

The fNIRS BIDS specification is a work in progress; expect changes while the BEP is in development. This example fNIRS dataset is formatted according to the BIDS specification. The repository demonstrates how a BIDS dataset should be stored, and also how to convert measurements obtained with a NIRx device to BIDS using MNE-BIDS (see the branches below for script details).
Experiment Description
This experiment examines how the motor cortex is activated during a finger-tapping task. Participants are asked to either tap their left thumb to their fingers, tap their right thumb to their fingers, or do nothing (control). Tapping lasts for 5 seconds and is prompted by an auditory cue. Sensors are placed over the motor cortex as described in the montage section in the link below; short channels are attached to the scalp as well. Further details about the experiment (including presentation code) can be found at rob-luke/experiment-fNIRS-tapping
Data Description
The dataset contains measurements from 5 participants. All details have been anonymised by hand in the raw data; alternatively, the anonymise argument could be used when writing the BIDS dataset.
How to use this repository
I have used branches in this repository to describe the steps taken to convert this data to the BIDS format. Using the GitHub interface you can select the branch you wish to view. The branches are:

- 00-Raw-data: contains just the raw recordings
- 01-Raw-to-SNIRF: converts the original data to SNIRF, but not BIDS
- 02-Raw-to-BIDS: converts the original data to BIDS (or as close as can be automated, before manual editing and movement to master)
- master: dataset in BIDS format
Branches 00 and 01 are included only for interested researchers. To generate the data in master, use branch 02, then remove the sourcedata directory and manually enter the author into dataset_description.json.
Dataset Information#
| Field | Value |
| --- | --- |
| Dataset ID | NM000175 |
| Title | fNIRS Finger Tapping |
| Author (year) | Luke2024 |
| Canonical | — |
| Importable as | NM000175, Luke2024 |
| Year | — |
| Authors | Robert Luke, Eric Larson, Alexandre Gramfort, Macquarie University |
| License | CC0 |
| Citation / DOI | Unknown |
| Source links | OpenNeuro · NeMAR · Source URL |
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 5
Recordings: 5
Tasks: 1
Channels: 56
Sampling rate (Hz): 7.8125
Duration (hours): 3.81
Pathology: Not specified
Modality: —
Type: —
Size on disk: 47.5 MB
File count: 5
Format: BIDS
License: CC0
DOI: —
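As a sanity check, the figures above are mutually consistent: at 7.8125 Hz, roughly 3.81 hours of data across 5 recordings works out to about 107,115 samples in total, or about 45.7 minutes per recording. A quick back-of-envelope sketch (the variable names are ours, the numbers are from the table above):

```python
# Figures taken from the Technical Details table
sfreq = 7.8125                   # sampling rate in Hz
total_hours = 3.808533333333333  # total duration across all recordings
n_recordings = 5

total_seconds = total_hours * 3600                    # ~13710.7 s overall
total_samples = total_seconds * sfreq                 # ~107115 samples overall
seconds_per_recording = total_seconds / n_recordings  # ~2742 s ≈ 45.7 min each

print(round(total_samples))          # 107115
print(round(seconds_per_recording))  # 2742
```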
API Reference#
Use the NM000175 class to access this dataset programmatically.
- class eegdash.dataset.NM000175(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#

  Bases: EEGDashDataset

  fNIRS Finger Tapping

  - Study: nm000175 (NeMAR)
  - Author (year): Luke2024
  - Canonical: —

  Also importable as: NM000175, Luke2024. Modality: fnirs. Subjects: 5; recordings: 5; tasks: 1.

  - Parameters:
    - cache_dir (str | Path) – Directory where data are cached locally.
    - query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
    - s3_bucket (str | None) – Base S3 bucket used to locate the data.
    - **kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.

  - data_dir#

    Local dataset cache directory (cache_dir / dataset_id).

    - Type: Path

  - query#

    Merged query with the dataset filter applied.

    - Type: dict

  - records#

    Metadata records used to build the dataset, if pre-fetched.

    - Type: list[dict] | None

  Notes

  Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

  References

  OpenNeuro dataset: https://openneuro.org/datasets/nm000175
  NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=nm000175

  Examples

  ```python
  >>> from eegdash.dataset import NM000175
  >>> dataset = NM000175(cache_dir="./data")
  >>> recording = dataset[0]
  >>> raw = recording.load()
  ```
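The docs note that a user-supplied `query` is AND-ed with the fixed dataset filter and must not contain the key `dataset`. The following pure-Python sketch illustrates that merging behaviour; the `merge_query` helper and `DATASET_FILTER` constant are hypothetical illustrations, not EEGDash's actual internals:

```python
# Fixed filter that pins every lookup to this dataset
DATASET_FILTER = {"dataset": "nm000175"}

def merge_query(user_query):
    """Combine a user query with the fixed dataset filter.

    Raises ValueError if the user tries to override the 'dataset'
    key, mirroring the documented constraint on the query argument.
    """
    if user_query and "dataset" in user_query:
        raise ValueError("query must not contain the key 'dataset'")
    merged = dict(DATASET_FILTER)  # copy so the constant stays untouched
    if user_query:
        merged.update(user_query)
    return merged

print(merge_query({"subject": {"$in": ["01", "02"]}}))
# {'dataset': 'nm000175', 'subject': {'$in': ['01', '02']}}
```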
See Also#
eegdash.dataset.EEGDashDataset, eegdash.dataset