ENH: Allow automated compilation of metadata (mne-tools#8834)
* ENH: Allow automated compilation of metadata

This adds mne.epochs.make_metadata(), which allows for the automated
construction of metadata for use in Epochs. The user passes an
events array, an event name -> event ID mapping, a time period, and a
sampling frequency. The function then iterates over all events
specified in the mapping, determines which of them fall into the specified
time period (relative to the respective time-locked event), and
adds them to the metadata.
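
A minimal usage sketch (the raw file name below is hypothetical, and the
keyword names follow the function as added in this PR):

    import mne

    # hypothetical example file, for illustration only
    raw = mne.io.read_raw_fif('sub-001_task-flankers_raw.fif')
    events, all_event_id = mne.events_from_annotations(raw)

    # one metadata row per event, describing all events that fall into the
    # window from 100 ms before to 500 ms after each row's time-locked event
    metadata, events, event_id = mne.epochs.make_metadata(
        events=events, event_id=all_event_id,
        tmin=-0.1, tmax=0.5, sfreq=raw.info['sfreq'])

    epochs = mne.Epochs(raw, events=events, event_id=event_id,
                        metadata=metadata)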

* Apply suggestions from code review

Co-authored-by: Eric Larson <[email protected]>

* Allow passing params by position

* Fix typo & add versionadded

* Add more functionality

* Cleanup, remove HED redundancy in event names, small fixes

* Add ERP-CORE dataset

* Add tutorial

* Add topoplot to tutorial

* Fix doc build

* Add noqa: D103

* Remove cell marker

* Clarify keep_first docstring [skip azp][skip github]

* Phrasing [skip azp][skip github]

* Formatting [skip azp][skip github]

* Remove trailing space

* Rework docstring [skip azp][skip github]

* Improve event_id param docstring [skip azp][skip github]

* More docstring improvements [skip azp][skip github]

* Fix indentation [skip azp][skip github]

* Phrasing [skip azp][skip github]

* Misc tutorial improvements [skip azp][skip github]

* Improve tutorial [skip azp][skip github]

* Improve docstring [skip azp][skip github]

* Slightly alter tutorial flow [skip azp][skip github]

* Correct typos & phrasing, as suggested in code review [skip azp][skip github]

* Apply some suggestions by @drammock

Co-authored-by: Daniel McCloy <[email protected]>

* Move paragraph to Notes section [skip azp][skip github]

* time_locked_events -> row_events; time period -> time window; doc improvements

* Optimizations, use df.itertuples(), improve tests

* flake

* Apply suggestions from review by @cbrnr [skip github][skip azp]

Co-authored-by: Clemens Brunner <[email protected]>

* Remove cruft

* WIP: Implement changes as per discussion

* Apply a bunch of suggestions from code review by @drammock

Co-authored-by: Daniel McCloy <[email protected]>

* Adjust tutorial to use new column names

* Style & docstring fixes

* Bug fixes & fixes to tests

* Final fixes to tutorial

* Add missing blank line

* Style

* Enable Circle pre-fetch of ERP CORE

* Add changelog entry [skip azp][skip github]

Co-authored-by: Eric Larson <[email protected]>
Co-authored-by: Daniel McCloy <[email protected]>
Co-authored-by: Clemens Brunner <[email protected]>
4 people authored Mar 5, 2021
1 parent 0135fca commit 91f6081
Showing 15 changed files with 895 additions and 10 deletions.
2 changes: 2 additions & 0 deletions doc/changes/latest.inc
@@ -94,6 +94,8 @@ Enhancements

- :func:`mne.time_frequency.EpochsTFR.average` now allows different ways of averaging, such as "median", or callable functions (:gh:`8879` by `Adam Li`_)

- `~mne.Epochs` metadata can now be generated automatically from events using `mne.epochs.make_metadata` (:gh:`8834` by `Richard Höchenberger`_)

Bugs
~~~~
- Fix bug with `mne.connectivity.spectral_connectivity` where time axis in Epochs data object was dropped. (:gh:`8839` **by new contributor** |Anna Padee|_)
1 change: 1 addition & 0 deletions doc/datasets.rst
@@ -39,3 +39,4 @@ Datasets
phantom_4dbti.data_path
refmeg_noise.data_path
ssvep.data_path
erp_core.data_path
1 change: 1 addition & 0 deletions doc/events.rst
@@ -53,3 +53,4 @@ Events
average_movements
combine_event_ids
equalize_epoch_counts
make_metadata
26 changes: 26 additions & 0 deletions doc/overview/datasets_index.rst
@@ -409,6 +409,31 @@ discriminate.
and demonstrates how to fit a single trial linear regression using the
information contained in the metadata of the individual datasets.

.. _erp-core-dataset:

ERP CORE Dataset
^^^^^^^^^^^^^^^^
:func:`mne.datasets.erp_core.data_path`

The original `ERP CORE dataset`_ :footcite:`Kappenman2021` contains data from
40 participants who completed 6 EEG experiments, carefully crafted to evoke
7 well-known event-related potential (ERP) components.

Currently, the MNE-Python ERP CORE dataset provides data from only one
participant (subject ``001``) of the Flankers paradigm, which elicits the
lateralized readiness potential (LRP) and error-related negativity (ERN). The
data provided is **not** the original data from the ERP CORE dataset, but
rather a slightly modified version, designed to demonstrate the Epochs metadata
functionality. For example, the reference and montage have already been set
correctly, and events are stored as Annotations. The data is provided in
``FIFF`` format.

.. topic:: Examples

* :ref:`tut-autogenerate-metadata`: Learn how to auto-generate
`~mne.Epochs` metadata, and visualize the error-related negativity (ERN)
ERP component.
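
A rough sketch of fetching and loading this dataset (the file name inside the
downloaded folder is an assumption for illustration):

    import os
    import mne

    # download the ERP CORE example data (or reuse an existing copy)
    path = mne.datasets.erp_core.data_path()

    # hypothetical file name -- check the actual layout of the download
    fname = os.path.join(path, 'ERP-CORE_Subject-001_Task-Flankers_eeg.fif')
    raw = mne.io.read_raw_fif(fname)
    events, event_id = mne.events_from_annotations(raw)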

.. _ssvep-dataset:

SSVEP
@@ -439,3 +464,4 @@ References
.. _resting state dataset tutorial: https://neuroimage.usc.edu/brainstorm/DatasetResting
.. _median nerve dataset tutorial: https://neuroimage.usc.edu/brainstorm/DatasetMedianNerveCtf
.. _SPM faces dataset: https://www.fil.ion.ucl.ac.uk/spm/data/mmfaces/
.. _ERP CORE dataset: https://erpinfo.org/erp-core
13 changes: 13 additions & 0 deletions doc/references.bib
@@ -2010,4 +2010,17 @@ @Article{Murray2008
publisher = {Springer Science and Business Media {LLC}},
}

@Article{Kappenman2021,
author = {Emily S. Kappenman and Jaclyn L. Farrens and Wendy Zhang and Andrew X. Stewart and Steven J. Luck},
journal = {{NeuroImage}},
title = {{ERP} {CORE}: An open resource for human event-related potential research},
year = {2021},
issn = {1053-8119},
month = {jan},
pages = {117465},
volume = {225},
doi = {10.1016/j.neuroimage.2020.117465},
publisher = {Elsevier {BV}},
}

@Comment{jabref-meta: databaseType:bibtex;}
5 changes: 3 additions & 2 deletions mne/datasets/__init__.py
@@ -24,6 +24,7 @@
from . import limo
from . import refmeg_noise
from . import ssvep
from . import erp_core
from .utils import (_download_all_example_data, fetch_hcp_mmp_parcellation,
                    fetch_aparc_sub_parcellation)
from ._fsaverage.base import fetch_fsaverage
@@ -34,6 +35,6 @@
'fetch_aparc_sub_parcellation', 'fetch_fsaverage', 'fetch_infant_template',
'fetch_hcp_mmp_parcellation', 'fieldtrip_cmc', 'hf_sef', 'kiloword',
'misc', 'mtrf', 'multimodal', 'opm', 'phantom_4dbti', 'sample',
'sleep_physionet', 'somato', 'spm_face', 'ssvep', 'testing', 'visual_92_categories',
'limo',
'sleep_physionet', 'somato', 'spm_face', 'ssvep', 'testing',
'visual_92_categories', 'limo', 'erp_core'
]
3 changes: 3 additions & 0 deletions mne/datasets/erp_core/__init__.py
@@ -0,0 +1,3 @@
"""ERP-CORE EEG dataset."""

from .erp_core import data_path, get_version
26 changes: 26 additions & 0 deletions mne/datasets/erp_core/erp_core.py
@@ -0,0 +1,26 @@
from functools import partial

from ...utils import verbose
from ..utils import (has_dataset, _data_path, _data_path_doc,
                     _get_version, _version_doc)

has_erp_core_data = partial(has_dataset, name='erp_core')


@verbose
def data_path(path=None, force_update=False, update_path=True,
              download=True, verbose=None):  # noqa: D103
    return _data_path(path=path, force_update=force_update,
                      update_path=update_path, name='erp_core',
                      download=download)


data_path.__doc__ = _data_path_doc.format(name='erp_core',
                                          conf='MNE_DATASETS_ERP_CORE_PATH')


def get_version():  # noqa: D103
    return _get_version('erp_core')


get_version.__doc__ = _version_doc.format(name='erp_core')
18 changes: 13 additions & 5 deletions mne/datasets/utils.py
@@ -247,6 +247,7 @@ def _data_path(path=None, force_update=False, update_path=True, download=True,
'limo': 'MNE_DATASETS_LIMO_PATH',
'refmeg_noise': 'MNE_DATASETS_REFMEG_NOISE_PATH',
'ssvep': 'MNE_DATASETS_SSVEP_PATH',
'erp_core': 'MNE_DATASETS_ERP_CORE_PATH'
}[name]

path = _get_path(path, key, name)
@@ -286,6 +287,7 @@ def _data_path(path=None, force_update=False, update_path=True, download=True,
phantom_4dbti='https://osf.io/v2brw/download?version=2',
refmeg_noise='https://osf.io/drt6v/download?version=1',
ssvep='https://osf.io/z8h6k/download?version=5',
erp_core='https://osf.io/rzgba/download?version=1'
)
# filename of the resulting downloaded archive (only needed if the URL
# name does not match resulting filename)
@@ -305,7 +307,8 @@ def _data_path(path=None, force_update=False, update_path=True, download=True,
'MNE-visual_92_categories-data-part2.tar.gz'],
phantom_4dbti='MNE-phantom-4DBTi.zip',
refmeg_noise='sample_reference_MEG_noise-raw.zip',
ssvep='ssvep_example_data.zip'
ssvep='ssvep_example_data.zip',
erp_core='MNE-ERP-CORE-data.tar.gz'
)
# original folder names that get extracted (only needed if the
# archive does not extract the right folder name; e.g., usually GitHub)
@@ -328,7 +331,8 @@ def _data_path(path=None, force_update=False, update_path=True, download=True,
fieldtrip_cmc='MNE-fieldtrip_cmc-data',
phantom_4dbti='MNE-phantom-4DBTi',
refmeg_noise='MNE-refmeg-noise-data',
ssvep='ssvep-example-data'
ssvep='ssvep-example-data',
erp_core='MNE-ERP-CORE-data'
)
md5_hashes = dict(
brainstorm=dict(
@@ -353,7 +357,8 @@ def _data_path(path=None, force_update=False, update_path=True, download=True,
fieldtrip_cmc='6f9fd6520f9a66e20994423808d2528c',
phantom_4dbti='938a601440f3ffa780d20a17bae039ff',
refmeg_noise='779fecd890d98b73a4832e717d7c7c45',
ssvep='af866bbc0f921114ac9d683494fe87d6'
ssvep='af866bbc0f921114ac9d683494fe87d6',
erp_core='5866c0d6213bd7ac97f254c776f6c4b1'
)
assert set(md5_hashes.keys()) == set(urls.keys())
url = urls[name]
@@ -583,7 +588,8 @@ def has_dataset(name):
'phantom_4dbti': 'MNE-phantom-4DBTi',
'mtrf': 'mTRF_1.5',
'refmeg_noise': 'MNE-refmeg-noise-data',
'ssvep': 'ssvep-example-data'
'ssvep': 'ssvep-example-data',
'erp_core': 'MNE-ERP-CORE-data'
}[name]
dp = _data_path(download=False, name=name, check_version=False,
archive_name=archive_name)
@@ -606,7 +612,7 @@ def _download_all_example_data(verbose=True):
eegbci, multimodal, opm, hf_sef, mtrf, fieldtrip_cmc,
kiloword, phantom_4dbti, sleep_physionet, limo,
fnirs_motor, refmeg_noise, fetch_infant_template,
fetch_fsaverage, ssvep)
fetch_fsaverage, ssvep, erp_core)
sample_path = sample.data_path()
testing.data_path()
misc.data_path()
@@ -640,6 +646,8 @@ def _download_all_example_data(verbose=True):
subjects_dir=sample_path + '/subjects', accept=True)
limo.load_data(subject=1, update_path=True)

erp_core.data_path()


@verbose
def fetch_aparc_sub_parcellation(subjects_dir=None, verbose=None):
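The entries added above register the new dataset end to end: the config key,
download URL, archive file name, extracted folder name, and MD5 hash. A small
sketch of how the new config key could be used to control where the archive is
stored (the target folder is hypothetical; mne.set_config is the standard
mechanism for such keys):

    import mne

    # store the ERP CORE download under a custom location
    mne.set_config('MNE_DATASETS_ERP_CORE_PATH', '/data/mne_data')

    # the first call downloads and unpacks MNE-ERP-CORE-data;
    # subsequent calls simply return the existing path
    path = mne.datasets.erp_core.data_path()
    print(mne.datasets.erp_core.get_version())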
