Commit fe3e236 (1 parent: 7f2f442)
Showing 666 changed files with 94,918 additions and 0 deletions.
@@ -0,0 +1,42 @@
<img alt="NWB:N" src="https://neurodatawithoutborders.github.io/images/nwb_n_logo.png" width="400">

# Welcome to the NWB:N Tutorial at Cosyne 2019!

The [Neurodata Without Borders: Neurophysiology (NWB:N)](https://neurodatawithoutborders.github.io/) team is holding a tutorial on the NWB:N data standard and on using [PyNWB](https://neurodatawithoutborders.github.io/pynwb) and [MatNWB](https://neurodatawithoutborders.github.io/matnwb) at the Cosyne 2019 Workshops.

The [NWB:N project](https://neurodatawithoutborders.github.io/) is an effort to standardize the description and storage of neurophysiology data and metadata. NWB:N enables data sharing and reuse and lowers the energy barrier to applying data analytics both within and across labs. NWB:N is more than just a file format: it defines an [ecosystem](https://neurodatawithoutborders.github.io/overview) of tools, methods, and standards for storing, sharing, and analyzing complex neurophysiology data.

We recently released [NWB:N 2.0](https://neurodatawithoutborders.github.io/news), and we are excited to teach our user base and potential new users about NWB:N and our software tools.

## Dates and Location

* **Date/Time:** March 4, 2019, 1-4pm
* **Location:** Hotel Miragem Cascais, Av. Marginal n.8554, 2754-536 Cascais, Portugal
* **Room:** Sala XVI
* A map of the conference rooms is available [here](https://www.cascaismirage.com/uploads/9/8/2/4/98249186/meeting_rooms_capacity_chart.pdf)

## Registration

Registration is still open! To attend the tutorial, please complete the [registration form](https://goo.gl/forms/LAMXakJ11p3Tlwdq2).

This tutorial is supported by the Kavli Foundation and participation is free, but registration is required. By completing the registration you help us plan attendance and target the event to attendees' interests. This registration is for the NWB:N tutorial only; to register for Cosyne 2019, please see the [Cosyne 2019 website](http://cosyne.org/c/index.php?title=Registration).

## Tutorial Program

* [NWB:N 2.0: Overview](https://drive.google.com/open?id=1Dq7zhQ4weiGv-3m6zD11ZrHQcObuddik)
* [Electrophysiology tutorial slides](https://docs.google.com/presentation/d/1Q03wU6NzMTOwuWaZIANtldNT-8ZKeM0fk3b0mEI-JFc/edit?usp=sharing)
* [Python Jupyter notebook](http://htmlpreview.github.io/?https://github.com/NeurodataWithoutBorders/nwb_hackathons/blob/master/Cosyne_2019/cosyne_NWB_tutorial_2019_python.html)
* [MATLAB code](http://htmlpreview.github.io/?https://github.com/NeurodataWithoutBorders/nwb_hackathons/blob/master/Cosyne_2019/cosyne_NWB_tutorial_2019_matlab.html)

## Organizing Committee

* Ben Dichter
* Oliver Ruebel, LBNL
* Stephanie Albin, Kavli Foundation

### Additional Organizational Support

* The Kavli Foundation

pr-preview/pr-322/Cosyne_2019/cosyne_NWB_tutorial_2019.ipynb (295 additions, 0 deletions)

@@ -0,0 +1,295 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Cosyne 2019 NWB:N Tutorial - Extracellular Electrophysiology"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set up NWB file\n",
"NWB files require a session start time to be entered with a timezone field."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pynwb import NWBFile\n",
"from datetime import datetime\n",
"from dateutil import tz\n",
"\n",
"start_time = datetime(2018, 4, 25, 2, 30, 3, tzinfo=tz.gettz('US/Pacific'))\n",
"\n",
"nwbfile = NWBFile(identifier='Mouse5_Day3',\n",
"                  session_description='mouse in open exploration and theta maze',  # required\n",
"                  session_start_time=start_time,  # required\n",
"                  experimenter='My Name',  # optional\n",
"                  session_id='session_id',  # optional\n",
"                  institution='University of My Institution',  # optional\n",
"                  lab='My Lab Name',  # optional\n",
"                  related_publications='DOI:10.1016/j.neuron.2016.12.011')  # optional"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Subject info"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pynwb.file import Subject\n",
"\n",
"nwbfile.subject = Subject(age='9 months', description='mouse 5',\n",
"                          species='Mus musculus', sex='M')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Position\n",
"The `Position` object is a `MultiContainerInterface` that holds one or more `SpatialSeries` objects, which are a subclass of `TimeSeries`. Here, we put a `SpatialSeries` object called `'position'` in a `Position` object, and put that in a `ProcessingModule` named `'behavior'`.\n",
"<img src=\"images/position.png\" width=\"800\">"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"from pynwb.behavior import SpatialSeries, Position\n",
"\n",
"position_data = np.array([\n",
"    np.linspace(0, 10, 100),\n",
"    np.linspace(1, 8, 100)]).T\n",
"spatial_series_object = SpatialSeries(\n",
"    name='position', data=position_data,\n",
"    reference_frame='unknown',\n",
"    conversion=1.0, resolution=np.nan,\n",
"    timestamps=np.linspace(0, 100, 100) / 200)  # one timestamp per sample"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pos_obj = Position(spatial_series=spatial_series_object)\n",
"behavior_module = nwbfile.create_processing_module(\n",
"    name='behavior',\n",
"    description='data relevant to behavior')\n",
"\n",
"behavior_module.add_data_interface(pos_obj)\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Write to file"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pynwb import NWBHDF5IO\n",
"\n",
"with NWBHDF5IO('test_ephys.nwb', 'w') as io:\n",
"    io.write(nwbfile)\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Electrodes table\n",
"Extracellular electrodes are stored in the `electrodes` table, which is a `DynamicTable`. `electrodes` has several required fields: x, y, z, impedance, location, filtering, and electrode_group. Here, we also demonstrate how to add optional columns to the table by adding the `'label'` column.\n",
"<img src=\"images/electrodes_table.png\" width=\"300\">"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"nwbfile.add_electrode_column('label', 'label of electrode')\n",
"shank_channels = [4, 3]\n",
"\n",
"electrode_counter = 0\n",
"device = nwbfile.create_device('implant')\n",
"for shankn, nelecs in enumerate(shank_channels):\n",
"    electrode_group = nwbfile.create_electrode_group(\n",
"        name='shank{}'.format(shankn),\n",
"        description='electrode group for shank {}'.format(shankn),\n",
"        device=device,\n",
"        location='brain area')\n",
"    for ielec in range(nelecs):\n",
"        nwbfile.add_electrode(\n",
"            x=5.3, y=1.5, z=8.5, imp=np.nan,\n",
"            location='unknown', filtering='unknown',\n",
"            group=electrode_group,\n",
"            label='shank{}elec{}'.format(shankn, ielec))\n",
"        electrode_counter += 1\n",
"\n",
"all_table_region = nwbfile.create_electrode_table_region(\n",
"    list(range(electrode_counter)), 'all electrodes')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## LFP\n",
"`LFP` is another `MultiContainerInterface`. It holds one or more `ElectricalSeries` objects, which are `TimeSeries`. Here, we put an `ElectricalSeries` named `'lfp'` in an `LFP` object, in a `ProcessingModule` named `'ecephys'`.\n",
"<img src=\"images/lfp.png\" width=\"800\">"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pynwb.ecephys import ElectricalSeries, LFP\n",
"\n",
"lfp_data = np.random.randn(100, 7)\n",
"ecephys_module = nwbfile.create_processing_module(\n",
"    name='ecephys',\n",
"    description='extracellular electrophysiology data')\n",
"ecephys_module.add_data_interface(\n",
"    LFP(ElectricalSeries('lfp', lfp_data, all_table_region,\n",
"                         rate=1000., resolution=.001, conversion=1.)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Spike Times\n",
"Spike times are stored in another `DynamicTable` of subtype `Units`. The main `Units` table is at `/units` in the HDF5 file. You can add columns to the `Units` table just like you did for `electrodes`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"for shankn, channels in enumerate(shank_channels):\n",
"    for n_units_per_shank in range(np.random.poisson(lam=5)):\n",
"        n_spikes = np.random.poisson(lam=10)\n",
"        spike_times = np.abs(np.random.randn(n_spikes))\n",
"        nwbfile.add_unit(spike_times=spike_times)"
]
},
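{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Added note (not part of the original tutorial):* a minimal sketch of adding a custom column to the `Units` table, analogous to `add_electrode_column` above. Because units have already been added, a value is supplied for every existing row via the `data` argument; the `'quality'` column name and its values are illustrative assumptions."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch (assumption, not from the original notebook): add a custom 'quality'\n",
"# column to the units table, providing a placeholder value for every unit\n",
"# that was added above.\n",
"nwbfile.add_unit_column(\n",
"    name='quality',\n",
"    description='sorting quality of each unit',\n",
"    data=['good'] * len(nwbfile.units))"
]
},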
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Trials\n",
"Trials is another `DynamicTable` that lives at `/intervals/trials`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"nwbfile.add_trial_column('correct', description='correct trial')\n",
"nwbfile.add_trial(start_time=1.0, stop_time=5.0, correct=True)\n",
"nwbfile.add_trial(start_time=6.0, stop_time=10.0, correct=False)"
]
},
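{
"cell_type": "markdown",
"metadata": {},
"source": [
"*Added note (not part of the original tutorial):* since `trials` is a `DynamicTable`, one convenient way to inspect it is as a pandas `DataFrame`. This assumes pandas is installed and that the installed PyNWB version provides `DynamicTable.to_dataframe`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch (assumption, not from the original notebook): view the trials\n",
"# table, including the custom 'correct' column, as a pandas DataFrame.\n",
"nwbfile.trials.to_dataframe()"
]
},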
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Write and read\n",
"Data arrays are read lazily from the file: `TimeSeries.data` does not load the entire dataset, but presents an h5py object that can be indexed to read data. Index this object just like a numpy array to read only a specific section of the array, or use the `[:]` operator to read the entire thing."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pynwb import NWBHDF5IO\n",
"\n",
"with NWBHDF5IO('test_ephys.nwb', 'w') as io:\n",
"    io.write(nwbfile)\n",
"\n",
"with NWBHDF5IO('test_ephys.nwb', 'r') as io:\n",
"    nwbfile2 = io.read()\n",
"\n",
"    print(nwbfile2.modules['ecephys']['LFP'].electrical_series['lfp'].data[:])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Accessing data regions\n",
"You can easily read subsections of datasets."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"io = NWBHDF5IO('test_ephys.nwb', 'r')\n",
"nwbfile2 = io.read()\n",
"\n",
"print('section of lfp:')\n",
"print(nwbfile2.modules['ecephys']['LFP'].electrical_series['lfp'].data[:10, :5])\n",
"print('')\n",
"print('')\n",
"print('spike times from first unit:')\n",
"print(nwbfile2.units['spike_times'][0])\n",
"io.close()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
}
},
"nbformat": 4,
"nbformat_minor": 2
}