diff --git a/jupyter-notebooks/trainings/Basemaps/Basemap_streaming.ipynb b/jupyter-notebooks/trainings/Basemaps/Basemap_streaming.ipynb
new file mode 100644
index 0000000..50a102d
--- /dev/null
+++ b/jupyter-notebooks/trainings/Basemaps/Basemap_streaming.ipynb
@@ -0,0 +1,302 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "id": "f94068e6-f0a2-40cf-ab96-bde308876141",
+ "metadata": {},
+ "source": [
+ "## Overview ##\n",
+ "---\n",
+    "In this notebook, you will learn how to order a [Planet Basemap](https://developers.planet.com/docs/data/visual-basemaps/) for your [Area of Interest](https://developers.planet.com/apis/orders/basemaps/#order-basemaps-by-area-of-interest-aoi) (AOI). We will place this order via Planet's [Orders API](https://developers.planet.com/apis/orders/) using the Planet Python [SDK](https://planet-sdk-for-python-v2.readthedocs.io/en/latest/python/sdk-guide/).\n",
+    "\n",
+    "1. Get a basemap name using either [Planet Explorer](https://developers.planet.com/docs/apps/explorer/), the [Basemap Viewer](https://www.planet.com/basemaps/#/mosaic/), or an API request.\n",
+    "2. Create a JSON order packet with the order parameters.\n",
+    "3. Set up a session with the Planet SDK, place the order, and download it.\n",
+    "4. Create a WMTS streaming link for scenes with the Tiles API."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "a89cdc6b-2857-49c3-81e2-af22cf92806b",
+ "metadata": {},
+ "source": [
+ "### Import Planet and Related Packages\n",
+ "\n",
+ "---\n",
+ "\n",
+    "Make sure you have Planet's Python SDK installed. You can find out more about it [here](https://developers.planet.com/docs/pythonclient/), and find your [API key](https://developers.planet.com/quickstart/apis/).\n",
+    "\n",
+    "Next, set up a session by importing the needed Python packages, pulling in your API key, and making an initial request to confirm a connection with the server."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "fcfc778a-857f-43ce-ac1c-4a17fd657c08",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import planet\n",
+ "import os\n",
+ "import json\n",
+ "import requests\n",
+ "from requests.auth import HTTPBasicAuth"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "5dc58e0c-ee2f-43e8-b8fc-9c06d225d8f5",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "PL_API_KEY = os.environ.get('PL_API_KEY')\n",
+ "auth = planet.Auth.from_key(PL_API_KEY)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "16439817-7825-487c-a3a4-69b039fb53b1",
+ "metadata": {},
+ "source": [
+ "### Get your basemap names"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "75c34420-be54-4760-9f32-20c2445ca2d7",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "BASEMAP_API_URL = 'https://api.planet.com/basemaps/v1/mosaics'\n",
+ "auth = HTTPBasicAuth(PL_API_KEY, '')\n",
+ "basemapServiceResponse = requests.get(url=BASEMAP_API_URL, auth=auth)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "2c1611dd-3fd6-44ca-9e7d-14f1215ca77f",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "# Raise an exception if the request failed, then parse the JSON body\n",
+    "basemapServiceResponse.raise_for_status()\n",
+    "basemaps = basemapServiceResponse.json()\n",
+    "mosaics = basemaps['mosaics']\n",
+    "names = [mosaic['name'] for mosaic in mosaics]\n",
+    "print(names)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "82755c4f",
+ "metadata": {},
+ "source": [
+    "Iterate through all of your available basemaps and extract the global monthly basemaps from 2020."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "b688d7d2-39f3-42fe-a824-ef8a9d9b9146",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "global_monthly = []\n",
+    "while True:\n",
+    "    for name in names:\n",
+    "        # Filter for the 2020 global monthly basemaps\n",
+    "        if \"2020\" in name and \"global_monthly\" in name:\n",
+    "            global_monthly.append(name)\n",
+    "    # Stop once there are no more pages to fetch\n",
+    "    if \"_next\" not in basemaps[\"_links\"]:\n",
+    "        break\n",
+    "    next_url = basemaps[\"_links\"][\"_next\"]\n",
+    "    basemapServiceResponse = requests.get(url=next_url, auth=auth)\n",
+    "    basemaps = basemapServiceResponse.json()\n",
+    "    mosaics = basemaps['mosaics']\n",
+    "    names = [mosaic['name'] for mosaic in mosaics]"
+ ]
+ },
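+  {
+   "cell_type": "markdown",
+   "id": "3f1c2a7e-8b41-4d1a-9c55-0e6f7a2b9d10",
+   "metadata": {},
+   "source": [
+    "The pagination pattern above can be wrapped in a small reusable helper. This is a sketch using plain `requests`; the `fetch_mosaic_names` name and its `predicate` argument are our own, not part of the Planet SDK."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "9d8e7f6a-5b4c-4d3e-8f2a-1b0c9d8e7f6a",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def fetch_mosaic_names(url, auth, predicate=lambda name: True):\n",
+    "    '''Collect mosaic names across all pages of the Basemaps API.'''\n",
+    "    collected = []\n",
+    "    while url:\n",
+    "        page = requests.get(url=url, auth=auth)\n",
+    "        page.raise_for_status()\n",
+    "        body = page.json()\n",
+    "        collected += [m['name'] for m in body['mosaics'] if predicate(m['name'])]\n",
+    "        # The last page has no '_next' link\n",
+    "        url = body['_links'].get('_next')\n",
+    "    return collected\n",
+    "\n",
+    "# For example, the same 2020 global monthly selection:\n",
+    "# fetch_mosaic_names(BASEMAP_API_URL, auth, lambda n: '2020' in n and 'global_monthly' in n)"
+   ]
+  },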
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "6b6bdf96-0aa7-44d2-8651-22b1b336fb2f",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "global_monthly"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "58d28178-2a25-4951-80fd-3a944dc9d4e8",
+ "metadata": {
+ "tags": []
+ },
+ "source": [
+ "### Create an order packet\n",
+ "---\n",
+    "Package up the details of your order in a [JSON object](https://developers.planet.com/apis/orders/basemaps/#example-order-query-json-block). In this example we download the results directly, but you can also configure [delivery](https://developers.planet.com/apis/orders/delivery/) to a cloud storage bucket. You can see examples of using tools [here](https://developers.planet.com/apis/orders/tools/). Make sure to replace the mosaic name and coordinates with your own specifications."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "d90799e2-11da-4f52-a3b3-e80f15c28d0a",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "order_params = {\n",
+ " \"name\": \"basemap order with geometry\",\n",
+ " \"source_type\": \"basemaps\",\n",
+ " \"order_type\": \"partial\",\n",
+ " \"products\": [\n",
+ " {\n",
+ " \"mosaic_name\": global_monthly[0],\n",
+ " \"geometry\": {\n",
+ " \"type\": \"Polygon\",\n",
+ " \"coordinates\": [\n",
+ " [\n",
+ " [4.607406, 52.353994],\n",
+ " [4.680005, 52.353994],\n",
+ " [4.680005, 52.395523],\n",
+ " [4.607406, 52.395523],\n",
+ " [4.607406, 52.353994]\n",
+ " ]\n",
+ " ]\n",
+ " }\n",
+ " }\n",
+ " ],\n",
+ " \"tools\": [{\"clip\": {}}]\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "b44869b6-0e7e-4fdc-8852-508ba2074bdc",
+ "metadata": {},
+ "source": [
+    "### Create a session with the SDK and poll for success\n",
+    "\n",
+    "Here, we create the order using the SDK, wait for it to complete, and then download the results. Note that this can take some time."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "b1bf7890-6c51-470e-bdfc-5efb781414da",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "async def create_and_deliver_order(order_params, client):\n",
+    "    '''Create an order and wait for it to be delivered\n",
+ "\n",
+ " Parameters:\n",
+ " order_params: An order request\n",
+ " client: An Order client object\n",
+ " '''\n",
+ " with planet.reporting.StateBar(state='creating') as reporter:\n",
+ " # Place an order to the Orders API\n",
+ " order = await client.create_order(order_params)\n",
+ " reporter.update(state='created', order_id=order['id'])\n",
+ " # Wait while the order is being completed\n",
+ " await client.wait(order['id'], callback=reporter.update_state,\n",
+ " max_attempts=0)\n",
+ " # if we get here that means the order completed. Yay! Download the files.\n",
+ " await client.download_order(order['id'])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "45c71902-c64a-4ad8-841d-4e4bb9c14915",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "async with planet.Session() as ps:\n",
+ " # The Orders API client\n",
+ " client = ps.client('orders')\n",
+ " # Create the order and deliver it to GCP\n",
+ " await create_and_deliver_order(order_params, client)"
+ ]
+ },
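+  {
+   "cell_type": "markdown",
+   "id": "c7d2e9f1-4a3b-4c5d-8e6f-2a1b3c4d5e6f",
+   "metadata": {},
+   "source": [
+    "If the kernel is interrupted while waiting, you can re-check an order later by its ID. This is a sketch using the SDK's `get_order` method; replace the placeholder ID with the one printed by the state bar."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "e1f2a3b4-5c6d-4e7f-8a9b-0c1d2e3f4a5b",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "async def check_order(order_id):\n",
+    "    '''Print the current state of an existing order.'''\n",
+    "    async with planet.Session() as ps:\n",
+    "        client = ps.client('orders')\n",
+    "        order = await client.get_order(order_id)\n",
+    "        print(order['state'])\n",
+    "\n",
+    "# await check_order('your-order-id-here')"
+   ]
+  },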
+ {
+ "cell_type": "markdown",
+ "id": "36e4700a-87fb-4dc5-882d-0939b656b880",
+ "metadata": {},
+ "source": [
+ "## Creating a streaming link for scenes\n",
+ "\n",
+    "This quick demo shows you how to use the Tiles API to create WMTS links that you can stream into your GIS software."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "90326987-2890-4388-822a-3b0a7bb32fdb",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tile_url = 'https://tiles0.planet.com/data/v1/layers'\n",
+ "#Copy the list of scene IDs from the API dialog into the ids JSON\n",
+ "data = {\n",
+ " \"ids\": [\n",
+ " \"PSScene:20230813_184925_91_2475\"\n",
+ " ]\n",
+ "}\n",
+ "api_key = os.getenv('PL_API_KEY')"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "id": "8425c5b9-935f-492f-b052-0814f9830901",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "res = requests.post(tile_url, auth=(api_key, \"\"), data=data)\n",
+    "print(res)\n",
+    "print(res.json())\n",
+    "name = res.json()[\"name\"]\n",
+    "url = f\"{tile_url}/wmts/{name}?api_key={api_key}\"\n",
+    "\n",
+    "# This URL includes your API key, so be careful where you share it!\n",
+    "print(url)"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/jupyter-notebooks/trainings/Data-Orders-SDK/planet_python_client_introduction.ipynb b/jupyter-notebooks/trainings/Data-Orders-SDK/planet_python_client_introduction.ipynb
new file mode 100644
index 0000000..585471e
--- /dev/null
+++ b/jupyter-notebooks/trainings/Data-Orders-SDK/planet_python_client_introduction.ipynb
@@ -0,0 +1,800 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Planet API Python Client\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "This tutorial is an introduction to [Planet](https://www.planet.com)'s Data and Orders APIs using the official [Python client](https://github.com/planetlabs/planet-client-python), the `planet` module. It shows you how to search for imagery, measure cloud cover over a specific AOI, and order the results.\n",
+ "\n",
+ "## Requirements\n",
+ "\n",
+ "This tutorial assumes familiarity with the [Python](https://python.org) programming language throughout. Python modules used in this tutorial are:\n",
+ "* [IPython](https://ipython.org/) and [Jupyter](https://jupyter.org/)\n",
+ "* [planet](https://github.com/planetlabs/planet-client-python)\n",
+    "* [geopandas](https://geopandas.org/)\n",
+ "* [rasterio](https://rasterio.readthedocs.io/en/latest/index.html)\n",
+ "* [shapely](https://shapely.readthedocs.io/en/stable/index.html)\n",
+ "* [asyncio](https://docs.python.org/3/library/asyncio.html)\n",
+ "\n",
+ "You should also have an account on the Planet Platform and retrieve your API key from your [account page](https://www.planet.com/account/).\n",
+ "\n",
+ "## Useful links \n",
+ "* [Planet Client V2 Documentation](https://github.com/planetlabs/planet-client-python)\n",
+ "* [Planet Data API reference](https://developers.planet.com/docs/apis/data/)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "This tutorial will cover the basic operations possible with the Python client, particularly those that interact with the Data API and the Orders API."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Set up\n",
+ "\n",
+    "In order to interact with the Planet API using the client, we need to import the necessary packages and define helper functions."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "#general packages\n",
+ "import os\n",
+ "import json\n",
+ "import asyncio\n",
+ "import rasterio\n",
+ "import numpy as np\n",
+ "import nest_asyncio\n",
+ "from datetime import datetime\n",
+ "\n",
+ "#geospatial packages\n",
+ "import geopandas as gpd\n",
+ "from rasterio.mask import mask\n",
+ "from shapely.geometry import shape\n",
+ "from shapely.ops import unary_union\n",
+ "from shapely.geometry import mapping\n",
+ "\n",
+ "#planet SDK\n",
+ "from planet import Auth\n",
+ "from planet import Session, data_filter\n",
+ "\n",
+ "\n",
+ "# We will also create a small helper function to print out JSON with proper indentation.\n",
+ "def indent(data):\n",
+ " print(json.dumps(data, indent=2))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "We next need to create a `client` object registered with our API key. The API key will be automatically read from the `PL_API_KEY` environment variable if it exists. If not, you can provide it below. You can also authenticate via the CLI using [`auth init`](https://planet-sdk-for-python-v2.readthedocs.io/en/latest/cli/cli-reference/?h=auth#auth:~:text=message%20and%20exit.-,auth,-%C2%B6); this will store your API key as an environment variable."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# if your Planet API Key is not set as an environment variable, you can paste it below\n",
+ "if 'PL_API_KEY' in os.environ:\n",
+ " API_KEY = os.environ['PL_API_KEY']\n",
+ "else:\n",
+ " API_KEY = 'PASTE_API_KEY_HERE'\n",
+ " os.environ['PL_API_KEY'] = API_KEY\n",
+ "\n",
+ "client = Auth.from_key(API_KEY)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Searching\n",
+ "\n",
+    "We can search for items of interest using the Data API client's search functions. Searches always require a request that includes a filter selecting the specific items to return as search results."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Let's also read a GeoJSON geometry into a variable so we can use it during our search. Note that to work with the Data API, the geometry can contain only one polygon."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "with open(\"sf_all.geojson\") as f:\n",
+ " geom_all = json.loads(f.read())\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Filters"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The possible filters include `and_filter`, `date_range_filter`, `range_filter` and so on, mirroring the options supported by the Planet API. Additional filters are described [here](https://planet-sdk-for-python-v2.readthedocs.io/en/latest/python/sdk-guide/#filter:~:text=(main())-,Filter,-%C2%B6)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "# Define the filters we'll use to find our data\n",
+    "\n",
+    "item_types = [\"PSScene\"]\n",
+    "\n",
+    "# Geometry filter\n",
+    "geom_filter = data_filter.geometry_filter(geom_all)\n",
+    "\n",
+    "# Date range filter (an upper bound can be added with the lt argument)\n",
+    "date_range_filter = data_filter.date_range_filter(\"acquired\", gt=datetime(month=5, day=1, year=2023))\n",
+    "\n",
+    "# Clear percent filter\n",
+    "clear_percent_filter = data_filter.range_filter('clear_percent', 0, None)\n",
+    "\n",
+    "# Combine all of the filters\n",
+    "combined_filter = data_filter.and_filter([geom_filter, clear_percent_filter, date_range_filter])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "combined_filter"
+ ]
+ },
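+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Other filter helpers follow the same pattern. For example, assuming the v2 SDK's `std_quality_filter` and `permission_filter` helpers, the search can be restricted to standard-quality scenes that you are permitted to download:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Assumes the std_quality_filter and permission_filter helpers in the v2 SDK\n",
+    "quality_filter = data_filter.std_quality_filter()\n",
+    "permission_filter = data_filter.permission_filter()\n",
+    "strict_filter = data_filter.and_filter(\n",
+    "    [combined_filter, quality_filter, permission_filter])"
+   ]
+  },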
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now let's build the request:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "async with Session() as sess:\n",
+    "    cl = sess.client('data')\n",
+    "    request = await cl.create_search(name='planet_client_demo', search_filter=combined_filter, item_types=item_types)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "# Search the Data API\n",
+    "\n",
+    "# The limit parameter caps the number of results returned from our search.\n",
+    "# The default limit is 100; here we raise it to 1000.\n",
+    "async with Session() as sess:\n",
+    "    cl = sess.client('data')\n",
+    "    items = cl.run_search(search_id=request['id'], limit=1000)\n",
+    "    item_list = [i async for i in items]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "print(len(item_list))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "If the number of items requested is more than 250, the client will automatically fetch additional pages of results in order to get the exact number requested.\n",
+    "\n",
+    "Then we can save the output as a GeoJSON so it can be visualized."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, we can iterate through our search results."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "for item in item_list:\n",
+ " print(item['id'], item['properties']['clear_percent'])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "geoms = {\n",
+    "    \"type\": \"FeatureCollection\",\n",
+    "    \"features\": []\n",
+    "}\n",
+    "\n",
+    "# Collect the footprint of every scene into one FeatureCollection\n",
+    "for item in item_list:\n",
+    "    geom_out = {\n",
+    "        \"type\": \"Feature\",\n",
+    "        \"properties\": {},\n",
+    "        \"geometry\": item['geometry']\n",
+    "    }\n",
+    "    geoms['features'].append(geom_out)\n",
+    "\n",
+    "# Write the collection once, after the loop\n",
+    "os.makedirs('output', exist_ok=True)\n",
+    "with open('output/results.geojson', 'w') as f:\n",
+    "    json.dump(geoms, f)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Now we can import our multiple AOI geometries."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "with open(\"sf_84.geojson\") as f:\n",
+ " geom_84 = json.loads(f.read())\n",
+ "geom_84 = geom_84['features']"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "A function that takes a scene geometry and compares it with the AOIs in order to measure coverage:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "# Calculate the percentage of the AOIs covered by a scene geometry\n",
+    "def get_overlap(geometry1, geometry2):\n",
+    "    # Parse the GeoJSON into shapely geometry objects\n",
+    "    shape1 = unary_union([shape(geom_1['geometry']) for geom_1 in geometry1])\n",
+    "    shape2 = shape(geometry2)\n",
+    "\n",
+    "    # Compute the intersection of the two geometries\n",
+    "    intersection = shape1.intersection(shape2)\n",
+    "\n",
+    "    # Compute the overlap as a percentage of the AOI area\n",
+    "    if intersection.area == 0:\n",
+    "        return 0\n",
+    "    return intersection.area / shape1.area * 100"
+ ]
+ },
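+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a quick sanity check, `get_overlap` can be exercised with two hand-made squares (illustrative coordinates only): a unit-square AOI inside a larger scene footprint should report full coverage."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "aoi = [{\"geometry\": {\"type\": \"Polygon\",\n",
+    "                     \"coordinates\": [[[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]]]}}]\n",
+    "scene = {\"type\": \"Polygon\",\n",
+    "         \"coordinates\": [[[0, 0], [2, 0], [2, 2], [0, 2], [0, 0]]]}\n",
+    "print(get_overlap(aoi, scene))  # the AOI is fully covered, so this prints 100.0"
+   ]
+  },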
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Filter the search results to keep only the scenes with 100% coverage over your two AOIs."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "# Recreate a FeatureCollection\n",
+    "geoms = {\n",
+    "    \"type\": \"FeatureCollection\",\n",
+    "    \"features\": []\n",
+    "}\n",
+    "\n",
+    "# Make a new list of fully covering scenes\n",
+    "covered_list = []\n",
+    "\n",
+    "for item in item_list:\n",
+    "    overlap = get_overlap(geom_84, item['geometry'])\n",
+    "    print(overlap)\n",
+    "    if overlap >= 100:\n",
+    "        scene = {\n",
+    "            \"type\": \"Feature\",\n",
+    "            \"properties\": {},\n",
+    "            \"geometry\": item['geometry']\n",
+    "        }\n",
+    "        geoms['features'].append(scene)\n",
+    "        covered_list.append(item)\n",
+    "\n",
+    "# Write the collection once, after the loop\n",
+    "with open('output/results_coverage.geojson', 'w') as f:\n",
+    "    json.dump(geoms, f)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Compare the total number of items with the number that cover the entirety of our two AOIs."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "print(len(item_list))\n",
+ "print(len(covered_list))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This GeoJSON file can be opened and viewed in any compatible application."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Assets and downloads\n",
+ "\n",
+ "After a search returns results, the Python client can be used to check for assets and initiate downloads. Let's start by looking at one item and the assets available to download for that item.\n",
+ "\n",
+ "For more information on Items and Assets, check out [Items & Assets](https://developers.planet.com/docs/apis/data/items-assets/) on the Planet Developer Center."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "# As an example, let's look at the first result in covered_list and grab its ID and cloud cover:\n",
+    "item = covered_list[0]\n",
+    "indent(item)\n",
+    "print(item['id'], item['properties']['cloud_percent'])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "There are a few steps involved in order to download an asset using the Planet Python Client:\n",
+ "\n",
+ "* **Get Asset:** Get a description of our asset based on the specifications we're looking for\n",
+ "* **Activate Asset:** Activate the asset with the given description\n",
+ "* **Wait Asset:** Wait for the asset to be activated\n",
+ "* **Download Asset:** Now our asset is ready for download!\n",
+ "\n",
+    "Let's go through these steps below. We'll do this for the cloud mask asset, `ortho_udm2`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "nest_asyncio.apply()\n",
+    "\n",
+    "async def download_cloud(item):\n",
+    "    async with Session() as sess:\n",
+    "        cl = sess.client('data')\n",
+    "        # Get Asset\n",
+    "        asset_desc = await cl.get_asset(item_type_id=item['properties']['item_type'], item_id=item['id'], asset_type_id='ortho_udm2')\n",
+    "        # Activate Asset\n",
+    "        await cl.activate_asset(asset=asset_desc)\n",
+    "        # Wait Asset\n",
+    "        await cl.wait_asset(asset=asset_desc)\n",
+    "        # Download Asset, retrying up to 5 times\n",
+    "        for i in range(5):\n",
+    "            try:\n",
+    "                asset_path = await cl.download_asset(asset=asset_desc, directory='cloud_output', overwrite=True)\n",
+    "                return asset_path\n",
+    "            except Exception as e:\n",
+    "                print(f\"Attempt {i+1} failed with error: {e}\")\n",
+    "                if i < 4:  # if not the last attempt\n",
+    "                    await asyncio.sleep(5)  # wait 5 seconds before retrying\n",
+    "                else:\n",
+    "                    raise  # re-raise the last exception if all attempts fail"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "await download_cloud(item)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "A function that takes a geometry and a cloud mask and outputs the percentage of clear imagery within the AOI:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "def calculate_mask_coverage(file_path, geometry):\n",
+    "    # Convert geometry to a GeoDataFrame in UTM zone 10N\n",
+    "    gdf = gpd.GeoDataFrame([1], geometry=[geometry], crs='EPSG:32610')\n",
+    "\n",
+    "    # Open the GeoTIFF file\n",
+    "    with rasterio.open(file_path) as src:\n",
+    "        # Transform geometry to the raster CRS\n",
+    "        gdf = gdf.to_crs(src.crs)\n",
+    "\n",
+    "        # Mask the GeoTIFF with the geometry\n",
+    "        out_image, out_transform = mask(src, [mapping(gdf.geometry.values[0])], crop=True, filled=False)\n",
+    "\n",
+    "    # Band 1 of the UDM2 is the clear/not-clear mask layer\n",
+    "    mask_band = out_image[0]\n",
+    "\n",
+    "    # Set pixels outside the AOI to -1 so they are excluded from the counts\n",
+    "    mask_band = np.where(mask_band.mask, -1, mask_band)\n",
+    "\n",
+    "    # Total number of pixels inside the AOI (values 0 or 1)\n",
+    "    total_pixels = np.sum(mask_band >= 0)\n",
+    "\n",
+    "    # Count the 1s (clear) and 0s (not clear)\n",
+    "    unique, counts = np.unique(mask_band[mask_band >= 0], return_counts=True)\n",
+    "    counts_dict = dict(zip(unique, counts))\n",
+    "\n",
+    "    # Percentage of clear pixels within the AOI\n",
+    "    return (counts_dict.get(1, 0) / total_pixels) * 100\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "We need a GeoJSON with the same CRS as the images, which in this case is UTM zone 10N (EPSG:32610)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "with open(\"sf_UTM.geojson\") as f:\n",
+ " geom_utm = json.loads(f.read())\n",
+ "geom_utm = geom_utm['features']"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "calculate_mask_coverage(\"cloud_output/20240616_190907_10_2477_3B_udm2.tif\",shape(geom_utm[0]['geometry']))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Now we run the function to download all of the UDMs asynchronously. The output is rather busy, so it is stored in the `captured` variable for later inspection."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "%%capture captured\n",
+    "\n",
+    "nest_asyncio.apply()\n",
+    "\n",
+    "# Download every UDM concurrently; inspect captured.stderr afterwards if needed\n",
+    "tasks = [download_cloud(item) for item in covered_list]\n",
+    "await asyncio.gather(*tasks)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "This reads all of the cloud GeoTIFF files and creates a list of clear percentages over each AOI, as well as a dictionary mapping each UDM file to its clear percentage."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import glob\n",
+ "\n",
+ "cloud_tifs = glob.glob(\"cloud_output/*\")\n",
+ "sunset = []\n",
+ "mission = []\n",
+ "sunset_free = {}\n",
+ "mission_free = {}\n",
+ "for cloud in cloud_tifs:\n",
+ " sunset_free[cloud] = calculate_mask_coverage(cloud,shape(geom_utm[1]['geometry']))\n",
+ " mission_free[cloud] = calculate_mask_coverage(cloud,shape(geom_utm[0]['geometry']))\n",
+ " sunset.append(sunset_free[cloud])\n",
+ " mission.append(mission_free[cloud])\n",
+ " "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "### Clear Percent\n",
+    "\n",
+    "Now print the average clear percentage over each AOI."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "print(np.mean(sunset))\n",
+ "print(np.mean(mission))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Ordering"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In this example, we will order a PSScene ortho_visual image. For variations on this kind of order, see Ordering Data.\n",
+ "\n",
+ "In this order, we request a visual bundle. A bundle is a group of assets for an item. See the Scenes Product Bundles Reference to learn about other bundles and other items."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Place Order\n",
+ "Create the order structure using `planet` functions"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from planet import order_request\n",
+ "\n",
+ "\n",
+ "async def assemble_order(item_ids):\n",
+ " products = [\n",
+ " order_request.product(item_ids, 'visual', 'PSScene')\n",
+ " ]\n",
+ "\n",
+ " tools = [order_request.clip_tool(aoi=geom_all)]\n",
+ "\n",
+ " request = order_request.build_request(\n",
+ " 'test_order_sdk_method_2', products=products, tools=tools)\n",
+ " return request\n",
+ " \n",
+ "request = await assemble_order(['20240615_190730_26_24d7'])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Having created the request, we can now place the order and wait for it to complete before downloading."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from planet import reporting, Session, OrdersClient\n",
+ "\n",
+ "\n",
+ "# an async Orders client to request order creation\n",
+ "async with Session() as sess:\n",
+ " cl = OrdersClient(sess)\n",
+ " with reporting.StateBar(state='creating') as bar:\n",
+ " # create order via Orders client\n",
+ " order = await cl.create_order(request)\n",
+ " bar.update(state='created', order_id=order['id'])\n",
+ "\n",
+ " # poll...poll...poll...\n",
+ " await cl.wait(order['id'], callback=bar.update_state)\n",
+ "\n",
+ " # if we get here that means the order completed. Yay! Download the files.\n",
+ " await cl.download_order(order['id'])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+    "Now let's wrap it in a function."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "async def do_order(order):\n",
+ " async with Session() as sess:\n",
+ " cl = OrdersClient(sess)\n",
+ " with reporting.StateBar(state='creating') as bar:\n",
+ " order = await cl.create_order(order)\n",
+ " bar.update(state='created', order_id=order['id'])\n",
+ "\n",
+ " await cl.wait(order['id'], callback=bar.update_state)\n",
+ "\n",
+ " # if we get here that means the order completed. Yay! Download the files.\n",
+ " await cl.download_order(order['id'])\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now we can order all the scenes at once"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "ids = []\n",
+ "for info in covered_list:\n",
+ " ids.append(info['id'])\n",
+ "\n",
+ "\n",
+ "request = await assemble_order(ids)\n",
+ "print(request)\n",
+ "await do_order(request)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+    "nest_asyncio.apply()\n",
+    "\n",
+    "# To run orders in parallel, build a list of order requests (here, one per scene)\n",
+    "order_list = [await assemble_order([scene_id]) for scene_id in ids]\n",
+    "\n",
+    "tasks = [do_order(o) for o in order_list]\n",
+    "await asyncio.gather(*tasks)"
+ ]
+  }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/jupyter-notebooks/trainings/Data-Orders-SDK/sf_84.geojson b/jupyter-notebooks/trainings/Data-Orders-SDK/sf_84.geojson
new file mode 100644
index 0000000..246c6bb
--- /dev/null
+++ b/jupyter-notebooks/trainings/Data-Orders-SDK/sf_84.geojson
@@ -0,0 +1,9 @@
+{
+"type": "FeatureCollection",
+"name": "sf_84",
+"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
+"features": [
+{ "type": "Feature", "properties": { }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -122.421537768042299, 37.769476410529691 ], [ -122.421537768042299, 37.746917437796313 ], [ -122.401447559588888, 37.746917437796306 ], [ -122.401447559588888, 37.769476410529691 ], [ -122.421537768042299, 37.769476410529691 ] ] ] } },
+{ "type": "Feature", "properties": { }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -122.510633475097777, 37.764182575560788 ], [ -122.510633475097777, 37.737477459270025 ], [ -122.490834429085552, 37.737477459270018 ], [ -122.490834429085552, 37.764182575560781 ], [ -122.510633475097777, 37.764182575560788 ] ] ] } }
+]
+}
diff --git a/jupyter-notebooks/trainings/Data-Orders-SDK/sf_UTM.geojson b/jupyter-notebooks/trainings/Data-Orders-SDK/sf_UTM.geojson
new file mode 100644
index 0000000..ed76e81
--- /dev/null
+++ b/jupyter-notebooks/trainings/Data-Orders-SDK/sf_UTM.geojson
@@ -0,0 +1,9 @@
+{
+"type": "FeatureCollection",
+"name": "sfd",
+"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:EPSG::32610" } },
+"features": [
+{ "type": "Feature", "properties": { }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 550946.224831453175284, 4180395.972079975064844 ], [ 550961.698579900898039, 4177893.078008538112044 ], [ 552731.633347172872163, 4177904.207551079802215 ], [ 552715.622136365040205, 4180407.103900391142815 ], [ 550946.224831453175284, 4180395.972079975064844 ] ] ] } },
+{ "type": "Feature", "properties": { }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 543102.423176795127802, 4179763.840731464326382 ], [ 543117.91674571717158, 4176800.950970201287419 ], [ 544862.415313734090887, 4176810.255063764285296 ], [ 544846.294853025465272, 4179773.147081824485213 ], [ 543102.423176795127802, 4179763.840731464326382 ] ] ] } }
+]
+}
diff --git a/jupyter-notebooks/trainings/Data-Orders-SDK/sf_all.geojson b/jupyter-notebooks/trainings/Data-Orders-SDK/sf_all.geojson
new file mode 100644
index 0000000..b5f2d34
--- /dev/null
+++ b/jupyter-notebooks/trainings/Data-Orders-SDK/sf_all.geojson
@@ -0,0 +1 @@
+{"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"coordinates":[[[-122.51515886076321,37.77473294917412],[-122.51515886076321,37.735199866713515],[-122.37649165396193,37.735199866713515],[-122.37649165396193,37.77473294917412],[-122.51515886076321,37.77473294917412]]],"type":"Polygon"}}]}
\ No newline at end of file
diff --git a/jupyter-notebooks/trainings/REST-APIs/REST_API_Intro.ipynb b/jupyter-notebooks/trainings/REST-APIs/REST_API_Intro.ipynb
new file mode 100644
index 0000000..d82656c
--- /dev/null
+++ b/jupyter-notebooks/trainings/REST-APIs/REST_API_Intro.ipynb
@@ -0,0 +1,605 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Intro to REST APIs\n",
+ "\n",
+ "This notebook demonstrates how to create requests and parse responses for the Data and Orders APIs, with a quick code snippet at the end dedicated to the Subscriptions API. We will create a Data API search request using a GeoJSON AOI, parse the response for image IDs, and use those IDs to place an order request.\n",
+ "\n",
+ "More reference information can be found at [Ordering & Delivery](https://developers.planet.com/apis/orders/)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import json\n",
+ "import os\n",
+ "import pathlib\n",
+ "import time\n",
+ "\n",
+ "import requests\n",
+ "from requests.auth import HTTPBasicAuth"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Authenticating\n",
+ "You can also paste your API key directly instead of using `os.getenv('PL_API_KEY')`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# API Key stored as an env variable\n",
+ "PLANET_API_KEY = os.getenv('PL_API_KEY')"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "orders_url = 'https://api.planet.com/compute/ops/orders/v2'\n",
+ "data_url = \"https://api.planet.com/data/v1\""
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "auth = HTTPBasicAuth(PLANET_API_KEY, '')\n",
+ "response = requests.get(data_url, auth=auth)\n",
+ "response"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Searching with the Data API\n",
+ "We can use the [Data API](https://developers.planet.com/docs/apis/data/) to automate searching based on criteria such as date range, cloud cover, area coverage, and AOI. You can create an AOI using https://geojson.io/."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def parse_geojson(filename):\n",
+ "    # return a single GeoJSON geometry from a file\n",
+ "    with open(filename) as f:\n",
+ "        features = json.load(f)\n",
+ "    if features.get(\"type\") == \"FeatureCollection\":\n",
+ "        # use the geometry of the first feature\n",
+ "        geoms = features[\"features\"][0][\"geometry\"]\n",
+ "    elif features.get(\"type\") == \"Feature\":\n",
+ "        geoms = features[\"geometry\"]\n",
+ "    else:\n",
+ "        geoms = features\n",
+ "\n",
+ "    return geoms"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "geometry = parse_geojson(\"sf.geojson\")\n",
+ "geometry"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# get images that overlap with our AOI \n",
+ "geometry_filter = {\n",
+ " \"type\": \"GeometryFilter\",\n",
+ " \"field_name\": \"geometry\",\n",
+ " \"config\": geometry\n",
+ "}\n",
+ "\n",
+ "# get images acquired within a date range\n",
+ "date_range_filter = {\n",
+ " \"type\": \"DateRangeFilter\",\n",
+ " \"field_name\": \"acquired\",\n",
+ " \"config\": {\n",
+ " \"gte\":\"2023-01-15T00:00:00Z\",\n",
+ " \"lte\":\"2023-01-17T00:00:00Z\"\n",
+ " }\n",
+ "}\n",
+ "\n",
+ "# only get images which have >50% clear pixels\n",
+ "cloud_cover_filter = {\n",
+ " \"type\": \"RangeFilter\",\n",
+ " \"field_name\": \"clear_percent\",\n",
+ " \"config\": {\n",
+ " \"gt\": 50\n",
+ " }\n",
+ "}\n",
+ "\n",
+ "# combine our geo, date, cloud filters\n",
+ "combined_filter = {\n",
+ " \"type\": \"AndFilter\",\n",
+ " \"config\": [geometry_filter, date_range_filter, cloud_cover_filter]\n",
+ "}"
+ ]
+ },
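The filters above are plain Python dicts, so composing them is just building nested dicts. A minimal sketch (the `and_filter` helper is hypothetical, not part of the API):

```python
def and_filter(*filters):
    # compose Data API filters with a logical AND
    return {"type": "AndFilter", "config": list(filters)}

date_filter = {
    "type": "DateRangeFilter",
    "field_name": "acquired",
    "config": {"gte": "2023-01-15T00:00:00Z", "lte": "2023-01-17T00:00:00Z"},
}
clear_filter = {
    "type": "RangeFilter",
    "field_name": "clear_percent",
    "config": {"gt": 50},
}

combined = and_filter(date_filter, clear_filter)
# combined["config"] now holds both filters, ready for a quick-search request
```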
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "item_type = \"PSScene\"\n",
+ "\n",
+ "# API request object\n",
+ "search_request = {\n",
+ " \"item_types\": [item_type], \n",
+ " \"filter\": combined_filter\n",
+ "}\n",
+ "search_request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# fire off the POST request\n",
+ "search_result = requests.post(\n",
+ "    'https://api.planet.com/data/v1/quick-search',\n",
+ "    auth=HTTPBasicAuth(PLANET_API_KEY, ''),\n",
+ "    json=search_request)\n",
+ "\n",
+ "# extract image IDs only\n",
+ "image_ids = [feature['id'] for feature in search_result.json()['features']]\n",
+ "print(len(image_ids))"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "search_result.json()['_links']"
+ ]
+ },
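Quick-search responses are paginated: when more results exist, the `_links` object carries a `_next` URL. A small sketch of extracting it (the helper name is an assumption):

```python
def next_page_url(page):
    # return the next-page URL from a quick-search response dict, or None
    return page.get('_links', {}).get('_next')

# e.g. loop while a next page exists:
#     page = requests.get(next_page_url(page), auth=auth).json()
```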
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Requests example\n",
+ "\n",
+ "In this notebook, we will be using `requests` to communicate with the orders v2 API. First, we will check our orders list to make sure authentication and communication is working as expected.\n",
+ "\n",
+ "We want to get a response code of `200` from this API call. To troubleshoot other response codes, see the [List Orders](https://developers.planet.com/apis/orders/reference/) API reference."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "auth = HTTPBasicAuth(PLANET_API_KEY, '')\n",
+ "response = requests.get(orders_url, auth=auth)\n",
+ "response"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Ordering\n",
+ "\n",
+ "In this example, we will order `PSScene` analytic imagery. For variations on this kind of order, see [Ordering Data](https://developers.planet.com/apis/orders/scenes/).\n",
+ "\n",
+ "In this order, we request an `analytic_udm2` bundle. A bundle is a group of assets for an item. The `analytic_udm2` bundle for the `PSScene` item contains three assets: the analytic image, the analytic XML metadata file, and the usable data mask (UDM2). See the [Product bundles reference](https://developers.planet.com/docs/orders/product-bundles-reference/) to learn about other bundles and items."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now we will list the names of orders we have created thus far. Your list may be empty if you have not created an order yet."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "orders = response.json()['orders']\n",
+ "[r['name'] for r in orders[:5]]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Place Order"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# set content type to json\n",
+ "headers = {'content-type': 'application/json'}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# define products part of order\n",
+ "single_product = [\n",
+ " {\n",
+ " \"item_ids\": [\"20220628_183020_20_248c\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"analytic_udm2\"\n",
+ " }\n",
+ "]\n",
+ "\n",
+ "same_src_products = [\n",
+ " {\n",
+ " \"item_ids\": [\"20151119_025740_0c74\",\n",
+ " \"20151119_025739_0c74\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"analytic_udm2\"\n",
+ " }\n",
+ "]\n",
+ "\n",
+ "multi_src_products = [\n",
+ " {\n",
+ " \"item_ids\": [\"20151119_025740_0c74\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"analytic_udm2\"\n",
+ " },\n",
+ " {\n",
+ " \"item_ids\": [\"20220628_183020_20_248c\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"visual\"\n",
+ " },\n",
+ " \n",
+ "]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "product = [\n",
+ " {\n",
+ " \"item_ids\": [image_ids[0]],\n",
+ " \"item_type\": \"PSScene\",\n",
+ "        \"product_bundle\": \"visual\"\n",
+ " }\n",
+ "]\n",
+ "\n",
+ "\n",
+ "request = { \n",
+ " \"name\":\"San Francisco\",\n",
+ " \"products\": product,\n",
+ " \"delivery\": {\"single_archive\": True, \"archive_type\": \"zip\"}\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def place_order(request, auth):\n",
+ "    response = requests.post(orders_url, data=json.dumps(request), auth=auth, headers=headers)\n",
+ "    # surface API errors (e.g. a bad bundle name) instead of a KeyError below\n",
+ "    if not response.ok:\n",
+ "        raise Exception(response.content)\n",
+ "    order_id = response.json()['id']\n",
+ "    print(order_id)\n",
+ "    order_url = orders_url + '/' + order_id\n",
+ "    return order_url"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "order_url = place_order(request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Poll for Order Success"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def poll_for_success(order_url, auth, num_loops=30):\n",
+ " count = 0\n",
+ "    while count < num_loops:\n",
+ " count += 1\n",
+ " r = requests.get(order_url, auth=auth)\n",
+ " response = r.json()\n",
+ " state = response['state']\n",
+ " print(state)\n",
+ " end_states = ['success', 'failed', 'partial']\n",
+ " if state in end_states:\n",
+ " break\n",
+ " time.sleep(30)\n",
+ " \n",
+ "poll_for_success(order_url, auth)"
+ ]
+ },
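Polling on a fixed 30-second interval works; for long-running orders, a geometric backoff reduces request volume. A sketch (the function name and defaults are assumptions, not part of the API):

```python
def backoff_delays(base=5, factor=2, cap=300, n=8):
    # yield n polling delays growing geometrically, capped at `cap` seconds
    delay = base
    for _ in range(n):
        yield delay
        delay = min(delay * factor, cap)

print(list(backoff_delays()))  # → [5, 10, 20, 40, 80, 160, 300, 300]
```

Each delay would replace the fixed `time.sleep(30)` inside the polling loop.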
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# define the clip tool\n",
+ "clip = {\n",
+ " \"clip\": {\n",
+ " \"aoi\": geometry\n",
+ " }\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "bandmath = {\n",
+ " \"bandmath\": {\n",
+ " \"b1\": \"b1\",\n",
+ " \"b2\": \"b2\",\n",
+ " \"b3\": \"b3\",\n",
+ " \"b4\": \"b4\",\n",
+ " \"b5\": \"(b4 - b3) / (b4 + b3)\",\n",
+ " \"pixel_type\": \"32R\",\n",
+ " }\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tool_request = { \n",
+ " \"name\":\"San Francisco Clipped Bandmath\",\n",
+ " \"products\": product,\n",
+ " \"tools\": [clip, bandmath],\n",
+ " \"delivery\": {\"single_archive\": True, \"archive_type\": \"zip\"}\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tool_order_url = place_order(tool_request, auth)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# point order_url at an existing order ID to inspect a previous order\n",
+ "order_url = orders_url + '/396a327c-4d52-41a5-9c81-6ee91166e59e'\n",
+ "\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### View Results\n",
+ "Now let's review our previous order and download it."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "requests.get(order_url, auth=auth).json()['state']"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "r = requests.get(order_url, auth=auth)\n",
+ "response = r.json()\n",
+ "results = response['_links']['results']\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "[r['name'] for r in results]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Download\n",
+ "\n",
+ "### Downloading each asset individually"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def download_results(results, overwrite=False):\n",
+ "    results_urls = [r['location'] for r in results]\n",
+ "    results_names = [r['name'] for r in results]\n",
+ "    print('{} items to download'.format(len(results_urls)))\n",
+ "\n",
+ "    for url, name in zip(results_urls, results_names):\n",
+ "        path = pathlib.Path(os.path.join('data', name))\n",
+ "\n",
+ "        if overwrite or not path.exists():\n",
+ "            print('downloading {} to {}'.format(name, path))\n",
+ "            r = requests.get(url, allow_redirects=True)\n",
+ "            path.parent.mkdir(parents=True, exist_ok=True)\n",
+ "            with open(path, 'wb') as f:\n",
+ "                f.write(r.content)\n",
+ "        else:\n",
+ "            print('{} already exists, skipping {}'.format(path, name))"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "download_results(results)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Subscription API\n",
+ "#### Creating a subscription\n",
+ "[Here](https://developers.planet.com/docs/subscriptions/delivery/) is more information on our various cloud delivery options."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# set content type to json\n",
+ "headers = {'content-type': 'application/json'}\n",
+ "\n",
+ "# set your delivery details\n",
+ "BUCKET_NAME = 'subscriptions_api_demo'\n",
+ "GCS_CREDENTIALS = ''  # base64-encoded service account key"
+ ]
+ },
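The `credentials` value must be the base64-encoded contents of a GCS service account key file. A sketch of producing it (the key file path is hypothetical):

```python
import base64

def encode_credentials(raw_key):
    # base64-encode service account key bytes for the delivery block
    return base64.b64encode(raw_key).decode('ascii')

# e.g. GCS_CREDENTIALS = encode_credentials(open('gcs_key.json', 'rb').read())
```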
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "request = {\n",
+ " \"name\": \"Tampa_Bay\",\n",
+ " \"source\": {\n",
+ " \"type\": \"catalog\",\n",
+ " \"parameters\": {\n",
+ " \"geometry\": {\n",
+ " \"coordinates\": [[[-82.775,27.48],[-82.365,27.48],[-82.365,28.07],[-82.775,28.07],[-82.775,27.48]]],\n",
+ " \"type\": \"Polygon\"\n",
+ " },\n",
+ " \"start_time\": \"2023-01-01T00:00:00Z\",\n",
+ " \"end_time\": \"2023-03-31T00:00:00Z\",\n",
+ " \"item_types\": [\"PSScene\"],\n",
+ " \"asset_types\": [\"ortho_analytic_4b\"]\n",
+ " }\n",
+ " },\n",
+ " \"delivery\": { \n",
+ " \"type\": \"google_cloud_storage\",\n",
+ " \"parameters\": {\n",
+ " \"bucket\": BUCKET_NAME,\n",
+ " \"credentials\": GCS_CREDENTIALS,\n",
+ " \n",
+ " }\n",
+ " }\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "subscriptions_url = 'https://api.planet.com/subscriptions/v1'\n",
+ "\n",
+ "def place_subscription(request, auth):\n",
+ "    response = requests.post(subscriptions_url, data=json.dumps(request), auth=auth, headers=headers)\n",
+ "    print(response.json())\n",
+ "    subscription_id = response.json()['id']\n",
+ "    print(subscription_id)\n",
+ "    subscription_url = subscriptions_url + '/' + subscription_id\n",
+ "    return subscription_url"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/jupyter-notebooks/trainings/REST-APIs/sf.geojson b/jupyter-notebooks/trainings/REST-APIs/sf.geojson
new file mode 100644
index 0000000..e1468d9
--- /dev/null
+++ b/jupyter-notebooks/trainings/REST-APIs/sf.geojson
@@ -0,0 +1 @@
+{"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"coordinates":[[[-122.53823337117416,37.81131273011326],[-122.53823337117416,37.69214939615783],[-122.3458817390349,37.69214939615783],[-122.3458817390349,37.81131273011326],[-122.53823337117416,37.81131273011326]]],"type":"Polygon"}}]}
\ No newline at end of file
diff --git a/jupyter-notebooks/trainings/Tools-and-Toolchains/tools_and_toolchains.ipynb b/jupyter-notebooks/trainings/Tools-and-Toolchains/tools_and_toolchains.ipynb
new file mode 100644
index 0000000..000fd9b
--- /dev/null
+++ b/jupyter-notebooks/trainings/Tools-and-Toolchains/tools_and_toolchains.ipynb
@@ -0,0 +1,765 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Tools and Toolchains\n",
+ "\n",
+ "This notebook demonstrates using tools and toolchains when ordering with the orders api. Specifically, this notebook demonstrates the following toolchains:\n",
+ " - [clip](#clip)\n",
+ " - [bandmath](#bandmath)\n",
+ " - [toar](#toar)\n",
+ " - [composite](#composite)\n",
+ " - [clip -> bandmath](#clip_bandmath)\n",
+ " - [toar -> reproject -> tile](#toar_reproject_tile)\n",
+ "\n",
+ "For background on ordering and downloading with the orders api, review the REST API + Python training.\n",
+ "\n",
+ "Reference information can be found at [Tools & toolchains](https://developers.planet.com/docs/orders/tools-toolchains/).\n",
+ "\n",
+ "## Setup"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import json\n",
+ "import os\n",
+ "import pathlib\n",
+ "import time\n",
+ "\n",
+ "import numpy as np\n",
+ "import rasterio\n",
+ "from rasterio.plot import show\n",
+ "import requests\n",
+ "from requests.auth import HTTPBasicAuth"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# API Key stored as an env variable\n",
+ "PLANET_API_KEY = os.getenv('PL_API_KEY')\n",
+ "#PLANET_API_KEY = \"\"\n",
+ "orders_url = 'https://api.planet.com/compute/ops/orders/v2'\n",
+ "\n",
+ "# set up requests to work with api\n",
+ "auth = HTTPBasicAuth(PLANET_API_KEY, '')\n",
+ "headers = {'content-type': 'application/json'}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# define products part of order\n",
+ "single_product = [\n",
+ " {\n",
+ " \"item_ids\": [\"20151119_025740_0c74\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"analytic_udm2\"\n",
+ " }\n",
+ "]\n",
+ "\n",
+ "single_product_rgb = [\n",
+ " {\n",
+ " \"item_ids\": [\"20151119_025740_0c74\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"visual\"\n",
+ " }\n",
+ "]\n",
+ "\n",
+ "same_src_products = [\n",
+ " {\n",
+ " \"item_ids\": [\"20151119_025740_0c74\",\n",
+ " \"20151119_025739_0c74\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"visual\"\n",
+ " }\n",
+ "]\n",
+ "\n",
+ "multi_src_products = [\n",
+ " {\n",
+ " \"item_ids\": [\"20151119_025740_0c74\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"analytic_udm2\"\n",
+ " },\n",
+ " {\n",
+ " \"item_ids\": [\"20220628_183020_20_248c\"],\n",
+ " \"item_type\": \"PSScene\",\n",
+ " \"product_bundle\": \"analytic_8b_udm2\"\n",
+ " },\n",
+ " \n",
+ "]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# define helpful functions for submitting, polling, and downloading an order\n",
+ "def place_order(request, auth):\n",
+ " response = requests.post(orders_url, data=json.dumps(request), auth=auth, headers=headers)\n",
+ " print(response)\n",
+ " \n",
+ " if not response.ok:\n",
+ " raise Exception(response.content)\n",
+ "\n",
+ " order_id = response.json()['id']\n",
+ " print(order_id)\n",
+ " order_url = orders_url + '/' + order_id\n",
+ " return order_url\n",
+ "\n",
+ "def poll_for_success(order_url, auth, num_loops=50):\n",
+ " count = 0\n",
+ "    while count < num_loops:\n",
+ " count += 1\n",
+ " r = requests.get(order_url, auth=auth)\n",
+ " response = r.json()\n",
+ " state = response['state']\n",
+ " print(state)\n",
+ " success_states = ['success', 'partial']\n",
+ " if state == 'failed':\n",
+ " raise Exception(response)\n",
+ " elif state in success_states:\n",
+ " break\n",
+ " \n",
+ " time.sleep(10)\n",
+ " \n",
+ "def download_order(order_url, auth, overwrite=False):\n",
+ " r = requests.get(order_url, auth=auth)\n",
+ " print(r)\n",
+ "\n",
+ " response = r.json()\n",
+ " results = response['_links']['results']\n",
+ " results_urls = [r['location'] for r in results]\n",
+ " results_names = [r['name'] for r in results]\n",
+ " results_paths = [pathlib.Path(os.path.join('data', n)) for n in results_names]\n",
+ " print('{} items to download'.format(len(results_urls)))\n",
+ " \n",
+ " for url, name, path in zip(results_urls, results_names, results_paths):\n",
+ " if overwrite or not path.exists():\n",
+ " print('downloading {} to {}'.format(name, path))\n",
+ " r = requests.get(url, allow_redirects=True)\n",
+ " path.parent.mkdir(parents=True, exist_ok=True)\n",
+ " open(path, 'wb').write(r.content)\n",
+ " else:\n",
+ " print('{} already exists, skipping {}'.format(path, name))\n",
+ " \n",
+ " return dict(zip(results_names, results_paths))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Tool Demos\n",
+ "\n",
+ "### No Processing (reference)\n",
+ "\n",
+ "We will order and download the unprocessed image for comparison with the output of the toolchains defined below."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "request = {\n",
+ " \"name\": \"no processing\",\n",
+ " \"products\": single_product_rgb,\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "order_url = place_order(request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Clip\n",
+ "\n",
+ "\n",
+ "Clipping is probably the most commonly used tool: it lets us process and download only the pixels we are interested in."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "clip_aoi = {\n",
+ " \"type\":\"Polygon\",\n",
+ " \"coordinates\":[[[94.81858044862747,15.858073043526062],\n",
+ " [94.86242249608041,15.858073043526062],\n",
+ " [94.86242249608041,15.894323164978303],\n",
+ " [94.81858044862747,15.894323164978303],\n",
+ " [94.81858044862747,15.858073043526062]]]\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# define the clip tool\n",
+ "clip = {\n",
+ " \"clip\": {\n",
+ " \"aoi\": clip_aoi\n",
+ " }\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# create an order request with the clipping tool\n",
+ "request_clip = {\n",
+ " \"name\": \"just clip\",\n",
+ " \"products\": single_product_rgb,\n",
+ " \"tools\": [clip]\n",
+ "}\n",
+ "\n",
+ "request_clip"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "clip_order_url = place_order(request_clip, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Band Math\n",
+ "\n",
+ "\n",
+ "To demonstrate band math, we will order an NDVI image, where NDVI = (NIR - red) / (NIR + red)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "bandmath = {\n",
+ " \"bandmath\": {\n",
+ " \"pixel_type\": \"32R\",\n",
+ " \"b1\": \"(b4 - b3) / (b4 + b3)\"\n",
+ " }\n",
+ "}"
+ ]
+ },
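The `b1` expression applies NDVI per pixel, with `b4` = NIR and `b3` = red for 4-band PSScene imagery. The same computation locally, as a NumPy sketch:

```python
import numpy as np

def ndvi(nir, red):
    # per-pixel NDVI = (NIR - red) / (NIR + red)
    nir = nir.astype('float64')
    red = red.astype('float64')
    return (nir - red) / (nir + red)

# dense vegetation reflects strongly in NIR, so NDVI approaches 1
print(ndvi(np.array([0.5]), np.array([0.1])))
```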
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "bandmath_request = {\n",
+ " \"name\": \"band math\",\n",
+ " \"products\": single_product,\n",
+ " \"tools\": [bandmath]\n",
+ "}\n",
+ "bandmath_request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "bandmath_order_url = place_order(bandmath_request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### TOAR\n",
+ "\n",
+ "\n",
+ "The `toar` tool converts analytic imagery from radiance to top-of-atmosphere (TOA) reflectance."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "toar = {\n",
+ " \"toar\": {\n",
+ " \"scale_factor\": 10000\n",
+ " }\n",
+ "}\n",
+ "toar_request = {\n",
+ " \"name\": \"toar\",\n",
+ " \"products\": single_product,\n",
+ " \"tools\": [toar]\n",
+ "}\n",
+ "toar_request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "toar_order_url = place_order(toar_request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Composite\n",
+ "\n",
+ "\n",
+ "The composite tool combines multiple images into one image, similar to mosaicking. The input images must have the same band configuration, and that band configuration will be propagated to the output image."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# group all inputs into a single composite; pass \"group_by\": \"strip_id\"\n",
+ "# in the composite parameters to composite per strip instead\n",
+ "composite = {\n",
+ "    \"composite\": {}\n",
+ "}\n",
+ "\n",
+ "composite_request = {\n",
+ " \"name\": \"composite\",\n",
+ " \"products\": same_src_products,\n",
+ " \"tools\": [composite]\n",
+ "}\n",
+ "\n",
+ "composite_request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "composite_order_url = place_order(composite_request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Combined Tools - Clip and Band Math\n",
+ "\n",
+ "\n",
+ "This toolchain combines the clip tool with the NDVI band math tool so that only the pixels we are interested in are processed and downloaded. Combining tools is as simple as listing the tool definitions, in the order they should run, in the order request's `tools` list."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "clip_bandmath_request = {\n",
+ " \"name\": \"clip and bandmath\",\n",
+ " \"products\": single_product,\n",
+ " \"tools\": [clip, bandmath]\n",
+ "}\n",
+ "clip_bandmath_request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "clip_bandmath_order_url = place_order(clip_bandmath_request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## TOAR, Reproject, Tile\n",
+ "\n",
+ "\n",
+ "For a more complicated example, we will convert the pixels to reflectance, reproject them to WGS84, and then tile them."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "reproject = {\n",
+ " \"reproject\": {\n",
+ " \"projection\": \"WGS84\",\n",
+ " \"kernel\": \"cubic\"\n",
+ " }\n",
+ "}\n",
+ "\n",
+ "tile = {\n",
+ " \"tile\": {\n",
+ " \"tile_size\": 1232,\n",
+ " \"origin_x\": -180,\n",
+ " \"origin_y\": -90,\n",
+ " \"pixel_size\": 0.000027056277056,\n",
+ " \"name_template\": \"C1232_30_30_{tilex:04d}_{tiley:04d}\"\n",
+ " }\n",
+ "}"
+ ]
+ },
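The `pixel_size` here is in degrees because the reprojected output is in WGS84; it corresponds to roughly a 3 m ground sample distance at the equator. A quick check, using the approximate figure of ~111,320 m per degree of longitude at the equator:

```python
# approximate meters per degree of longitude at the equator
METERS_PER_DEGREE = 111_320.0

def gsd_to_degrees(gsd_m):
    # convert a ground sample distance in meters to degrees at the equator
    return gsd_m / METERS_PER_DEGREE

print(gsd_to_degrees(3.0))  # close to the 0.000027056277056 used above
```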
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "trt_request = {\n",
+ " \"name\": \"toar reproject tile\",\n",
+ " \"products\": single_product,\n",
+ " \"tools\": [toar, reproject, tile]\n",
+ "}\n",
+ "trt_request"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "trt_order_url = place_order(trt_request, auth)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Visualize"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# define helpful functions for visualizing downloaded imagery\n",
+ "def show_analytic(img_file):\n",
+ " with rasterio.open(img_file) as src:\n",
+ " b,g,r,n = src.read()\n",
+ "\n",
+ " rgb = np.stack((r,g,b), axis=0)\n",
+ " show(rgb/rgb.max())\n",
+ " \n",
+ "def show_rgb(img_file):\n",
+ " with rasterio.open(img_file) as src:\n",
+ " show(src)\n",
+ " \n",
+ " \n",
+ "def show_gray(img_file):\n",
+ " with rasterio.open(img_file) as src:\n",
+ " g = src.read(1)\n",
+ " show(g/g.max())"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### No processing"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(order_url, auth)\n",
+ "downloaded_files = download_order(order_url, auth)\n",
+ "img_file = next(downloaded_files[d] for d in downloaded_files\n",
+ " if d.endswith('3B_Visual.tif'))\n",
+ "print(img_file)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "show_rgb(img_file)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Clipped"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(clip_order_url, auth)\n",
+ "downloaded_clip_files = download_order(clip_order_url, auth)\n",
+ "clip_img_file = next(downloaded_clip_files[d] for d in downloaded_clip_files\n",
+ " if d.endswith('_3B_Visual_clip.tif'))\n",
+ "clip_img_file"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "show_rgb(clip_img_file)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Bandmath"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(bandmath_order_url, auth)\n",
+ "downloaded_bandmath_files = download_order(bandmath_order_url, auth)\n",
+ "bandmath_img_file = next(downloaded_bandmath_files[d] for d in downloaded_bandmath_files\n",
+ " if d.endswith('_bandmath.tif'))\n",
+ "bandmath_img_file"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# compare the original visual image with the single-band band math result\n",
+ "show_rgb(img_file)\n",
+ "show_gray(bandmath_img_file)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### TOAR (top-of-atmosphere reflectance)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(toar_order_url, auth)\n",
+ "downloaded_toar_files = download_order(toar_order_url, auth)\n",
+ "toar_img_file = next(downloaded_toar_files[d] for d in downloaded_toar_files\n",
+ " if d.endswith('_toar.tif'))\n",
+ "toar_img_file"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# compare the original visual image with the TOAR result\n",
+ "show_rgb(img_file)\n",
+ "show_analytic(toar_img_file)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Composite"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(composite_order_url, auth)\n",
+ "downloaded_composite_files = download_order(composite_order_url, auth)\n",
+ "composite_file = next(downloaded_composite_files[d] for d in downloaded_composite_files\n",
+ " if d.endswith('composite.tif'))\n",
+ "composite_file"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "show_rgb(composite_file)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### Clip and Bandmath"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(clip_bandmath_order_url, auth)\n",
+ "downloaded_clip_bandmath_files = download_order(clip_bandmath_order_url, auth)\n",
+ "clip_bandmath_file = next(downloaded_clip_bandmath_files[d] for d in downloaded_clip_bandmath_files\n",
+ " if d.endswith('_clip_bandmath.tif'))\n",
+ "clip_bandmath_file"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "show_gray(clip_bandmath_file)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### TOAR/Reproject/Tile"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "poll_for_success(trt_order_url, auth)\n",
+ "downloaded_trt_files = download_order(trt_order_url, auth)\n",
+ "# collect the reprojected/tiled outputs by their shared filename prefix\n",
+ "tile_files = [d for d in downloaded_trt_files.values()\n",
+ "              if d.name.startswith('C1232_30_30_')]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# preview the first several tiles\n",
+ "for f in tile_files[:8]:\n",
+ "    show_rgb(f)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "test_file = tile_files[0]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# inspect the tile's metadata (projection, extent, size) with GDAL\n",
+ "!gdalinfo $test_file"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# compare against the original image's metadata\n",
+ "!gdalinfo $img_file"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.12.3"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}