In discussions with our potential users, we have realized that installing napari-iohub is a barrier. I suggest simplifying access as follows:

* Build a Docker image with napari-iohub, napari-animation, and a few other utilities important for our analysis workflow.
* Make the Docker image available on our HPC so that it can be launched from the CLI, just like how we launch Micro-Manager (a rough build sketch follows the list). The workflow could look like:
  * `module load comp.micro`
  * `micro-manager` and/or `napari`
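As a very rough sketch of the build side, assuming Apptainer on the HPC (the image name, tag, and wrapper command are placeholders, not existing artifacts):

```bash
# Hypothetical image name/tag; the Dockerfile would pip-install napari,
# napari-iohub, and napari-animation on top of a Python base image.
docker build -t comp-micro/napari-viz:latest .

# Convert the Docker image into an Apptainer image for the HPC
# (assumes the image has been pushed to a registry).
apptainer build napari-viz.sif docker://comp-micro/napari-viz:latest

# A `napari` wrapper exposed by the comp.micro module could then run:
apptainer exec --nv napari-viz.sif napari
```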
For Micro-Manager we needed containerization because the Java/C++ project does not provide Linux binaries and only has build instructions for Ubuntu (while Bruno runs Rocky Linux), so we use Apptainer to run Ubuntu on top of Rocky.
Meanwhile, napari-iohub is a pure-Python package and does not have the same constraints. @JoOkuma maintains a shared Conda environment (`imgproc`, which shows up after `ml royerlab`) that contains most of napari-iohub's dependencies (and all of the required ones) and has been used by multiple internal users for the same purpose (not having to install napari and plugins themselves). I think we should either consolidate efforts or at least try to replicate that simpler solution first, since setting up the Conda environment needs to be done regardless of the containerization decision.
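For comparison, the existing Conda-based route is already only a couple of commands. This is an assumed usage sketch based on the description above; the module and environment names (`royerlab`, `imgproc`) come from this thread, and the activation step is my guess at how the shared environment is used on Bruno:

```bash
ml royerlab             # makes the shared Conda environments visible
conda activate imgproc  # shared environment maintained by @JoOkuma (assumed activation step)
napari                  # launch napari from the shared environment
```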