DS005185#
Ear-EEG Sleep Monitoring 2019 (EESM19)
Access recordings and metadata through EEGDash.
Citation: Kaare B. Mikkelsen, Preben Kidmose, Yousef Rezaei Tabar (2024). Ear-EEG Sleep Monitoring 2019 (EESM19). 10.18112/openneuro.ds005185.v1.0.2
Modality: eeg Subjects: 20 Recordings: 1654 License: CC0 Source: openneuro
Metadata: Complete (100%)
Quickstart#
Install
pip install eegdash
Access the data
from eegdash.dataset import DS005185
dataset = DS005185(cache_dir="./data")
# Get the raw object of the first recording
raw = dataset.datasets[0].raw
print(raw.info)
Filter by subject
dataset = DS005185(cache_dir="./data", subject="01")
Advanced query
dataset = DS005185(
    cache_dir="./data",
    query={"subject": {"$in": ["01", "02"]}},
)
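A note on how such queries behave: per the API reference further down this page, the user query is ANDed with a fixed dataset filter and must not contain the key dataset. The sketch below illustrates that contract with a hypothetical helper written for this page; the real merging happens inside EEGDashDataset.

```python
def merge_queries(dataset_filter: dict, user_query: dict) -> dict:
    """Illustrative AND-merge of a fixed dataset filter with a user query."""
    if "dataset" in user_query:
        raise ValueError("user query must not contain the key 'dataset'")
    merged = dict(dataset_filter)   # e.g. {"dataset": "ds005185"}
    merged.update(user_query)       # user fields are ANDed on top
    return merged

print(merge_queries({"dataset": "ds005185"}, {"subject": {"$in": ["01", "02"]}}))
# {'dataset': 'ds005185', 'subject': {'$in': ['01', '02']}}
```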
Iterate recordings
for rec in dataset:
    print(rec.subject, rec.raw.info['sfreq'])
If you use this dataset in your research, please cite the original authors.
BibTeX
@dataset{ds005185,
title = {Ear-EEG Sleep Monitoring 2019 (EESM19)},
author = {Kaare B. Mikkelsen and Preben Kidmose and Yousef Rezaei Tabar},
doi = {10.18112/openneuro.ds005185.v1.0.2},
url = {https://doi.org/10.18112/openneuro.ds005185.v1.0.2},
}
About This Dataset#
EESM19: Ear-EEG Sleep Monitoring data set
This data set was collected as part of development and quality assessment of the ear-EEG as a sleep monitoring platform. Data collection took place between 2018 and 2020. First publication was in 2019 (https://doi.org/10.1038/s41598-019-53115-3), hence the ‘19’ in the name.
The data set consists of two parts (a & b): (a) 20 subjects who each spent 4 nights sleeping with a partial PSG (EEG, EOG and chin EMG electrodes), ear-EEG and a wrist-worn actigraph, in their own homes. (b) Of these 20 subjects, 10 also slept a further 12 nights wearing only ear-EEG, actigraph and a single EOG electrode. Each night is saved as a separate ‘session’, meaning that some subjects have 4 sessions while others have 16. The PSG nights are always sessions 1-4. Each PSG night has an additional ‘scoring’ event file, where ‘scoring’ is the ‘acquisition’ type.
Questionnaires: After each night’s recording, the subject answered a short questionnaire regarding the quality of the night’s sleep. This has been archived as behavioral data (task=’comfort’).
Diaries: Besides the comfort questionnaire, the subjects also kept a standardized diary regarding the events of the night. These have been imported too; however, only the required fields ‘Syncronization’, ‘Electrodetest’, ‘Went to bed’, ‘Lights out’ and ‘Got up’ have been translated from Danish to English. We suggest using an online translation tool for any additional entries. The diaries have a column ‘pressedTrigger’, which indicates that the subject marked the precise time of the event on their wrist-worn actigraph. Because some interpretation is necessary (there are both spurious extra trigger presses and missing ones), and because these event markings eventually turned out not to be important for our own research, we have not exported these trigger times in the data set. However, as the full actigraphy file is included in this data set, any interested future user can do the matching themselves. For consistency, we have chosen to use the starting time written in the scored edf file (‘edf1’) as the starting time of each PSG recording. For non-PSG recordings, the starting time is what is written in the diary. An alternative would be to use the start time as seen in the wrist actigraph, described below.
Actigraphy: Subjects wore GENEactive actigraphs (‘actiwatches’ for short). These record 3-axis acceleration as well as temperature, light and user button presses. Because the temperature and light readings are strongly affected by whether the subject’s hand was above or below the covers, we found that only the acceleration data and button presses were of much use. However, all data is found in the actigraphy files (in the behavior folders). To ensure the possibility of perfect alignment between actiwatch and EEG recorder (TMSI ‘mobita’), at the beginning of each recording the subjects shook the mobita and the actiwatch together in a repeated rhythmical pattern. By accessing the mobita actigraphy data from the .set file (EEG.etc.acc.data) it is possible to get perfect alignment. This is advantageous if very high precision of various sleep events is desired, since the clock in the actiwatch was very reliable. In practice, we have not used this option, and hence the actigraphy alignment is left up to the user.
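The shake-based synchronization above lends itself to a simple cross-correlation of the two acceleration signals. The sketch below uses synthetic data and only illustrates the idea, not the authors' procedure; in practice you would feed it one axis (or the magnitude) of the mobita accelerometer signal and of the GENEactive recording, resampled to a common rate.

```python
import numpy as np

def estimate_delay(ref: np.ndarray, sig: np.ndarray) -> int:
    """Samples by which `sig` lags `ref` (positive = sig starts later)."""
    corr = np.correlate(ref - ref.mean(), sig - sig.mean(), mode="full")
    return (len(sig) - 1) - int(np.argmax(corr))

# Synthetic 'shake' burst embedded at different offsets in two recordings.
rng = np.random.default_rng(0)
burst = rng.standard_normal(50)
mobita = np.zeros(500)
mobita[100:150] = burst        # burst starts at sample 100
actiwatch = np.zeros(500)
actiwatch[130:180] = burst     # same burst, 30 samples later

print(estimate_delay(mobita, actiwatch))  # 30
```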
Electrode test: As a quality check on the electrode connections subjects viewed a short video containing various instructions: repeated jaw clenching, open/closed eyes, horizontal eye movements. These are marked in the diaries, and can be used as a simple test that the EEG equipment is working as intended. An analysis of these responses can be found in https://doi.org/10.3389/fncom.2021.565244.
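A minimal sanity check in the spirit of the electrode test might compare signal power inside and outside an instructed jaw-clench window. The data below are synthetic, and the 2x RMS threshold is an arbitrary illustration, not a validated criterion from the cited analysis.

```python
import numpy as np

def rms(x: np.ndarray) -> float:
    return float(np.sqrt(np.mean(np.square(x))))

fs = 500  # Hz, the dataset's sampling rate
rng = np.random.default_rng(2)
baseline = rng.standard_normal(10 * fs)                    # quiet segment
clench = baseline.copy()
clench[2 * fs:4 * fs] += 5 * rng.standard_normal(2 * fs)   # simulated EMG burst

# A responsive channel shows clearly elevated RMS during the clench window.
ratio = rms(clench[2 * fs:4 * fs]) / rms(baseline[2 * fs:4 * fs])
print(ratio > 2)  # True
```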
Note regarding artifact rejection: We advise against using the data directly from the .poly5 files. The primary reason is that we had some issues with faulty shielding on some of the electrodes (good shielding is necessary for dry-contact electrodes). This caused signal leakage between electrodes, which is highly unwanted, and which could make the ear-EEG channels contain PSG data, even after rereferencing. We went to great lengths to identify these electrodes, using both algorithms and physical inspection of all electrodes between recordings, and are confident that there are no issues in the .set files (for which these electrodes have been set to ‘NaN’). Note that this identification and discarding is the only preprocessing which has been done to the EEG data.
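Since the discarded electrodes are stored as NaN in the .set files, a loader can identify them generically. The helper and channel names below are hypothetical; with MNE-Python you would typically add such channels to raw.info['bads'] or drop them before rereferencing.

```python
import numpy as np

def find_discarded_channels(data: np.ndarray, ch_names: list[str]) -> list[str]:
    """Return names of channels (rows of `data`) whose samples are all NaN."""
    all_nan = np.isnan(data).all(axis=1)
    return [name for name, bad in zip(ch_names, all_nan) if bad]

# Toy example: 3 channels x 1000 samples, one channel fully discarded.
data = np.random.default_rng(1).standard_normal((3, 1000))
data[1, :] = np.nan
print(find_discarded_channels(data, ["EarL-A", "EarL-B", "EarR-A"]))  # ['EarL-B']
```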
For questions regarding this data set, contact: Kaare Mikkelsen, Mikkelsen.kaare@ece.au.dk, https://orcid.org/0000-0002-7360-8629
Dataset Information#
Dataset ID: ds005185
Title: Ear-EEG Sleep Monitoring 2019 (EESM19)
Year: 2024
Authors: Kaare B. Mikkelsen, Preben Kidmose, Yousef Rezaei Tabar
License: CC0
Citation / DOI: 10.18112/openneuro.ds005185.v1.0.2
Source links: OpenNeuro | NeMAR | Source URL
Found an issue with this dataset?
If you encounter any problems with this dataset (missing files, incorrect metadata, loading errors, etc.), please let us know!
Technical Details#
Subjects: 20
Recordings: 1654
Tasks: 4
Channels: 25
Sampling rate (Hz): 500.0
Duration (hours): 0.0
Pathology: Healthy
Modality: Sleep
Type: Sleep
Size on disk: 267.6 GB
File count: 1654
Format: BIDS
License: CC0
DOI: doi:10.18112/openneuro.ds005185.v1.0.2
API Reference#
Use the DS005185 class to access this dataset programmatically.
- class eegdash.dataset.DS005185(cache_dir: str, query: dict | None = None, s3_bucket: str | None = None, **kwargs)[source]#
Bases: EEGDashDataset

OpenNeuro dataset ds005185. Modality: eeg; Experiment type: Sleep; Subject type: Healthy. Subjects: 20; recordings: 356; tasks: 3.

- Parameters:
cache_dir (str | Path) – Directory where data are cached locally.
query (dict | None) – Additional MongoDB-style filters to AND with the dataset selection. Must not contain the key dataset.
s3_bucket (str | None) – Base S3 bucket used to locate the data.
**kwargs (dict) – Additional keyword arguments forwarded to EEGDashDataset.
- data_dir#
Local dataset cache directory (cache_dir / dataset_id).
Type: Path
- query#
Merged query with the dataset filter applied.
Type: dict
- records#
Metadata records used to build the dataset, if pre-fetched.
Type: list[dict] | None
Notes

Each item is a recording; recording-level metadata are available via dataset.description. query supports MongoDB-style filters on fields in ALLOWED_QUERY_FIELDS and is combined with the dataset filter. Dataset-specific caveats are not provided in the summary metadata.

References
OpenNeuro dataset: https://openneuro.org/datasets/ds005185 NeMAR dataset: https://nemar.org/dataexplorer/detail?dataset_id=ds005185
Examples
>>> from eegdash.dataset import DS005185
>>> dataset = DS005185(cache_dir="./data")
>>> recording = dataset[0]
>>> raw = recording.load()
See Also#
eegdash.dataset.EEGDashDataset
eegdash.dataset