preparation instructions: small fixes and minor formatting
schlegelp committed Sep 16, 2024
1 parent 7017eb2 commit 0aff42f
Most of the workshop will be using [Jupyter](https://jupyter.org/) for interactive coding.
## Prerequisites
Windows, macOS, and Linux should all work. Alternatively, you can run Python in a Jupyter notebook in the cloud using Google Colab.

The tutorials are compatible with Python version >= 3.9. You should have a current version of `pip` (>= 23) and `setuptools` (>= 68).

We will use the command `python3` to invoke Python on Linux, but on Windows the appropriate command may be `python`. You should find out what the appropriate command is for Python on your system.
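One quick way to check is to ask the interpreter for its version directly — a minimal sketch, assuming a Python interpreter is already on your `PATH`:

```shell
# Ask the interpreter for its version; if `python3` is not found,
# try `python` (common on Windows) instead.
python3 --version

# Verify the version meets the workshop requirement (>= 3.9);
# this prints nothing if the check passes.
python3 -c 'import sys; assert sys.version_info >= (3, 9), "Python >= 3.9 required"'
```

If the first command prints something like `Python 3.11.4` and the second produces no output, your interpreter meets the workshop's requirements.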

Both `pip` and `setuptools` can be updated using `pip` itself:

```shell
python3 -m pip install --upgrade pip setuptools
```

In the tutorials, we use some software which is available from PyPI but not as a `conda` package. If you prefer to use `conda`, you will have to install some things in a `conda` environment
using a `conda`-managed version of `pip`.

In order to avoid dependency conflicts with other Python packages on your device,
including packages managed by the system package manager rather than `pip` or `conda`, you should create a virtual environment to install
all Python packages that will be used in the workshop. Start by navigating to the directory in which you plan to work, and follow the instructions appropriate for your package manager.

Alternatively, if you plan to use Google Colab for everything, you can skip to that section.
The workshop will focus on four main packages.

In addition to these core tools, we will rely on some other software:

- `scanpy`, a Python package for the analysis of single-cell gene expression data
- `Jupyter`, an interactive environment for evaluating Python code and plotting graphs
- `ipywidgets`, which extends the functionality of Jupyter (e.g., progress bars)
- `leidenalg`, an implementation of the Leiden clustering algorithm
- `umap-learn`, which is useful for visualizing the clustering structure of high-dimensional data
- `pandas`, a dataframe library for manipulating and filtering tabular data
- `plotly`, a rich visualization library

### Installing dependencies with Pip
`pip` is the standard package manager for Python. We recommend using a virtual environment, which you can create like so:

```shell
python3 -m venv ./neuro_workshop
```

After creating the virtual environment, you should activate it, which sets certain environment and path variables to
ensure that Python commands and modules refer to code in the `./neuro_workshop` folder. The way this is done depends on your operating system:

=== "Windows PowerShell"

    In Windows PowerShell, the command is:

    ```shell
    .\neuro_workshop\Scripts\activate.ps1
    ```

=== "Linux and macOS"

    On Linux and macOS, the command is:

    ```shell
    source ./neuro_workshop/bin/activate
    ```

You can tap out of the virtual environment with this command:

```shell
deactivate
```

!!! info "Help?"

    The section "How venvs work" of the [Python venv documentation](https://docs.python.org/3/library/venv.html)
    contains a table of the activation commands indexed by operating system and shell.


Before continuing, check the left-hand side of your terminal prompt: when the virtual environment is active, its name (`neuro_workshop`) appears there.
Then run this to install all required dependencies in one go:

```shell
python3 -m pip install jupyterlab ipywidgets cajal scanpy leidenalg navis umap-learn pandas plotly pynapple nemos matplotlib requests
```
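To confirm that the installation worked, you can run a quick import check from within the activated environment. This is a minimal sketch; the package names below simply mirror a few entries from the install command above:

```python
# Report which of the workshop's packages can be imported
# from the currently active environment.
import importlib

for pkg in ("pandas", "matplotlib", "navis", "scanpy", "plotly"):
    try:
        importlib.import_module(pkg)
        print(f"{pkg}: OK")
    except ImportError:
        print(f"{pkg}: missing - re-run the pip install command above")
```

Any package reported as missing can be installed individually with `python3 -m pip install <package>`.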
You should be good to go! Try firing up JupyterLab and importing e.g. `navis`!

### Installing dependencies with Conda
`Conda` is a package manager commonly used in data science and bioinformatics. We do not provide a `conda` package for our tools; these instructions discuss how to use `conda` to create a sandbox for `pip`.
Instructions for working with virtual environments in `conda` can be found [here](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html).
You can open a `conda` terminal from the Windows start menu by launching the program "Anaconda Powershell Prompt."

Create and activate a virtual environment with:

```shell
conda create --prefix ./neuro_workshop python=3.11
conda activate ./neuro_workshop
```
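Since the `pip` installs below must go into this environment rather than into a system-wide Python, it is worth verifying that the active interpreter and `pip` are the `conda`-managed ones — a quick check, assuming the environment was created at `./neuro_workshop` as shown above:

```shell
# The interpreter path should point inside the neuro_workshop environment;
# if it does not, re-run `conda activate ./neuro_workshop`.
command -v python3

# Show which pip is active and where it lives.
python3 -m pip --version
```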

Install dependencies as follows:

```shell
conda config --append channels conda-forge
conda install numpy scipy jupyterlab ipywidgets scanpy leidenalg umap-learn pandas plotly
```

Now run this:

```shell
pip install cajal navis[all] pynapple nemos
```
and `pip` should install the relevant dependencies into the `./neuro_workshop` directory you have just created. Try firing up JupyterLab and importing e.g. `cajal`!

### Google Colab
Google Colab is a cloud computing service which can execute Jupyter notebooks stored in Google Drive. To use Google Colab for this tutorial, go to <https://colab.research.google.com/> and click `File` > `New Notebook in Drive`.

Install the Python dependencies using `!` to run a shell command from a Jupyter cell:

```shell
!pip install cajal scanpy leidenalg navis umap-learn pandas plotly pynapple nemos matplotlib requests
```

!!! important

    When starting a Google Colab runtime, some libraries such as `matplotlib` will already have been loaded in the background. To make
    sure that any updates installed via our call to `!pip install` actually take effect, you should click `Runtime` > `Restart Session`
    to restart the Python kernel.

## Getting Help
The workshop organizers can be contacted for installation help in the [Slack group](https://join.slack.com/t/pythontoolsfo-ehx1178/shared_invite/zt-2qjzd1c44-NZ~9kt0~kh47X6t80tK8Mg) for the workshop, or via the [Discussions](https://github.com/navis-org/neuropython2024/discussions) in this website's GitHub repository.


!!! question "What about data?!"

    We're still putting the finishing touches on the exercises we want to run with you!
    Once that dust has settled, we will share the data artefacts you'll need to follow along during the course.
