In my opinion, Neuroconv doesn't offer an easy abstraction that would help you speed up your process. Let me explain.
Neuroconv works well for building interfaces that automatically extract data from common and stable formats. If the format or convention in MATLAB is stable and widely used, it makes sense to build an interface in Neuroconv so other users can benefit. A good example is the CellExplorer format from György Buzsáki's lab, which I describe in more detail below. However, in my experience this is not commonly the case. MATLAB conventions tend to be neither popular nor stable enough to justify the effort of writing an interface. In most of our conversions that involve interacting with MATLAB code, we write conversion-specific code instead (here's a recent example).
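To give a sense of what such conversion-specific code can look like, here is a minimal sketch that reads spike times from a .mat file with SciPy. The file layout and field names ("spikes", "times") are invented for illustration and not a real lab convention:

```python
import numpy as np
from scipy.io import loadmat, savemat

# Fabricate a small .mat file standing in for a lab-specific spikes
# struct; the "spikes"/"times" field names are assumptions.
savemat("spikes.mat", {"spikes": {"times": np.array([0.1, 0.5, 1.2])}})

# simplify_cells=True (SciPy >= 1.5) unwraps MATLAB structs into plain dicts
mat = loadmat("spikes.mat", simplify_cells=True)
spike_times = mat["spikes"]["times"]
```

In practice this kind of script grows to cover whatever quirks the lab's files have, which is exactly why it usually stays conversion-specific rather than becoming a reusable interface.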
Regarding the choice between Python and MATLAB, I think it largely depends on who is doing the conversion. If the person is more proficient in MATLAB, then it might be a good choice to write the conversion code in MATLAB. Neuroconv is a Python library, and most of us have expertise in Python, so we do our conversions in Python. Another consideration is that library and community support for this kind of work is generally better in Python, so if all else is equal, I would opt for Python.
Those are the trade-offs I would consider.
To shed further light on what the effort with Neuroconv would look like, I can describe how the code to extract spike timings for the Buzsáki lab's CellExplorer format works.
The Neuroconv way of doing this would be to create an interface to extract the data. You can see a good example in the CellExplorerInterface, which works like this:
A sorting extractor in SpikeInterface reads the data from the MATLAB convention as a sorting extractor object.
The interface in Neuroconv uses the sorting object from SpikeInterface and also reads the metadata from the MATLAB code. Together they extract the data and export it to NWB. In that case, Neuroconv does have facilities to route the data to the correct NWB types.
But you would need to build objects like the ones in the links.
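To make the two-layer pattern above concrete, here is a stripped-down conceptual sketch of how an extractor and an interface divide the work. This is pure Python with no real Neuroconv or SpikeInterface classes; every name here is invented for illustration:

```python
# Conceptual sketch only: mimics the extractor/interface split that
# Neuroconv uses. None of these classes come from a real library.

class FakeSortingExtractor:
    """Plays the role of SpikeInterface: reads spike times per unit."""

    def __init__(self, data):
        self._data = data  # e.g. spike trains parsed from the MATLAB files

    def get_unit_ids(self):
        return list(self._data)

    def get_unit_spike_train(self, unit_id):
        return self._data[unit_id]


class FakeSortingInterface:
    """Plays the role of a Neuroconv interface: pairs the extractor
    with metadata and routes everything into an NWB-like container."""

    def __init__(self, extractor, metadata):
        self.extractor = extractor
        self.metadata = metadata

    def run_conversion(self):
        nwb_like = {"metadata": self.metadata, "units": {}}
        for unit_id in self.extractor.get_unit_ids():
            nwb_like["units"][unit_id] = self.extractor.get_unit_spike_train(unit_id)
        return nwb_like


interface = FakeSortingInterface(
    FakeSortingExtractor({"unit0": [0.1, 0.5], "unit1": [0.3]}),
    metadata={"session_id": "example"},
)
result = interface.run_conversion()
```

The real classes do the same division of labor, only with proper NWB objects on the output side: the extractor knows the on-disk format, the interface knows the NWB schema and the metadata mapping.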
I hope this gives you a better idea of the effort involved in building an interface versus using custom code. Feel free to ask more questions, or reach out to me on Slack for a short discussion if you'd like.
The convention is described at https://mvdmlab.host.dartmouth.edu/wiki2/doku.php?id=analysis:nsb2017:week2#introduction_to_neural_data_types . It is primarily already spike-sorted data (spike timings), LFP (continuous), and events, to be written into .nwb files. Should we code directly in pynwb? Or matnwb? Or is there some help from Neuroconv abstractions to just "route" data sources and readily convert some metadata types?
Thank you for the guidance.