|
6 | 6 | "source": [ |
7 | 7 | "## Cloudless Mosaic\n", |
8 | 8 | "\n", |
9 | | - "In this tutorial, you'll learn how to constructs a *cloudless mosaic* (also known as a composite) from a time series of satellite images. The tutorial covers the following steps:\n", |
| 9 | + "This tutorial constructs a *cloudless mosaic* (also known as a composite) from a time series of satellite images. We'll see the following:\n", |
10 | 10 | "\n", |
11 | | - "* [Find a time series of images at a particular point on Earth](#Discover-data)\n", |
12 | | - "* [Stack those images together into a single array](#Stack-images)\n", |
13 | | - "* [Compute the cloudless mosaic by taking a median](#Median-composite)\n", |
14 | | - "* [Create mosaics after grouping the data](#Monthly-composite)\n", |
| 11 | + "* Find a time series of images at a particular point on Earth\n", |
| 12 | + "* Stack those images together into a single array\n", |
| 13 | + "* Compute the cloudless mosaic by taking a median\n", |
| 14 | + "* Visualize the results\n", |
15 | 15 | "\n", |
16 | | - "This example uses [Sentinel-2 Level-2A](https://planetarycomputer.microsoft.com/dataset/sentinel-2-l2a) data. The techniques used here work equally well with other remote-sensing datasets.\n", |
17 | | - "\n", |
18 | | - "---" |
| 16 | + "This example uses [Sentinel-2 Level-2A](https://planetarycomputer.microsoft.com/dataset/sentinel-2-l2a) data. The techniques used here apply equally well to other remote-sensing datasets." |
19 | 17 | ] |
20 | 18 | }, |
21 | 19 | { |
|
43 | 41 | "source": [ |
44 | 42 | "### Create a Dask cluster\n", |
45 | 43 | "\n", |
46 | | - "This example requires processing a large amount of data. To cut down on the execution time, use a Dask cluster to do the computation in parallel, adaptively scaling to add and remove workers as needed. See [Scale With Dask](../quickstarts/scale-with-dask.ipynb) for more on using Dask." |
| 44 | + "We're going to process a large amount of data. To cut down on the execution time, we'll use a Dask cluster to do the computation in parallel, adaptively scaling to add and remove workers as needed. See [Scale With Dask](../quickstarts/scale-with-dask.ipynb) for more on using Dask." |
47 | 45 | ] |
48 | 46 | }, |
49 | 47 | { |
|
60 | 58 | } |
61 | 59 | ], |
62 | 60 | "source": [ |
63 | | - "cluster = GatewayCluster() # creates the Dask Scheduler - might take a minute.\n", |
| 61 | + "cluster = GatewayCluster() # Creates the Dask Scheduler. Might take a minute.\n", |
64 | 62 | "\n", |
65 | 63 | "client = cluster.get_client()\n", |
66 | 64 | "\n", |
|
74 | 72 | "source": [ |
75 | 73 | "### Discover data\n", |
76 | 74 | "\n", |
77 | | - "In this example, the area of interest is located near Redmond, Washington. It is defined as a GeoJSON object." |
| 75 | + "In this example, we define our area of interest as a GeoJSON object. It's near Redmond, Washington." |
78 | 76 | ] |
79 | 77 | }, |
80 | 78 | { |
|
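The area-of-interest cell referenced above can be sketched as a plain GeoJSON Polygon dictionary. The coordinates below are illustrative values near Redmond, Washington, not the notebook's exact polygon:

```python
# A minimal sketch of an area-of-interest GeoJSON object.
# Coordinates here are illustrative, not the notebook's exact values.
area_of_interest = {
    "type": "Polygon",
    "coordinates": [
        [
            [-122.2751, 47.5469],
            [-121.9613, 47.5469],
            [-121.9613, 47.7458],
            [-122.2751, 47.7458],
            [-122.2751, 47.5469],  # first and last point must match to close the ring
        ]
    ],
}
```

GeoJSON uses `[longitude, latitude]` order, and a Polygon's outer ring must be explicitly closed by repeating the first coordinate.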
102 | 100 | "cell_type": "markdown", |
103 | 101 | "metadata": {}, |
104 | 102 | "source": [ |
105 | | - "Use `pystac_client` to search the Planetary Computer's STAC endpoint for items matching your query parameters:" |
| 103 | + "Using `pystac_client`, we can search the Planetary Computer's STAC endpoint for items matching our query parameters." |
106 | 104 | ] |
107 | 105 | }, |
108 | 106 | { |
|
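The search step above might look like the following sketch. The bounding box, date range, and cloud-cover threshold are illustrative assumptions, not the notebook's exact values; the live call is gated behind a flag so the snippet runs without network access:

```python
# Sketch of a STAC search against the Planetary Computer.
# The bbox, datetime, and cloud-cover threshold below are assumptions.
search_params = {
    "collections": ["sentinel-2-l2a"],
    "bbox": [-122.2751, 47.5469, -121.9613, 47.7458],
    "datetime": "2019-06-01/2019-09-01",
    "query": {"eo:cloud_cover": {"lt": 25}},  # only scenes < 25% cloudy
}

RUN_LIVE = False  # set True to query the live API (requires pystac-client)
if RUN_LIVE:
    import pystac_client

    catalog = pystac_client.Client.open(
        "https://planetarycomputer.microsoft.com/api/stac/v1"
    )
    search = catalog.search(**search_params)
    items = list(search.get_items())
    print(len(items))
```

The `query` extension filters on item properties server-side, so only scenes meeting the cloud-cover threshold are returned.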
137 | 135 | "cell_type": "markdown", |
138 | 136 | "metadata": {}, |
139 | 137 | "source": [ |
140 | | - "As you can see, there are 138 items that match your search requirements in terms of location, time, and cloudiness. Those items will still have *some* clouds over portions of the scenes, though. \n", |
141 | | - "\n", |
142 | | - "### Stack images\n", |
143 | | - "\n", |
144 | | - "To create a cloudless mosaic, first, load the data into an [xarray](https://xarray.pydata.org/en/stable/) DataArray using [stackstac](https://stackstac.readthedocs.io/):" |
| 138 | + "So 138 items match our search requirements over space, time, and cloudiness. Those items will still have *some* clouds over portions of the scenes, though. To create our cloudless mosaic, we'll load the data into an [xarray](https://xarray.pydata.org/en/stable/) DataArray using [stackstac](https://stackstac.readthedocs.io/) and then reduce the time series of images down to a single image." |
145 | 139 | ] |
146 | 140 | }, |
147 | 141 | { |
|
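The "stack" step produces a single 4-D `(time, band, y, x)` array from many per-date scenes. As a toy illustration of the idea with plain NumPy (stackstac does this lazily over real rasters):

```python
import numpy as np

# Toy illustration of the "stack" step: combine per-date images
# (here random 2-band, 4x4 scenes) into one (time, band, y, x) array.
rng = np.random.default_rng(0)
scenes = [rng.random((2, 4, 4)) for _ in range(5)]  # five acquisition dates

stack = np.stack(scenes, axis=0)
print(stack.shape)  # (5, 2, 4, 4): time, band, y, x
```

Once everything shares one array with a time axis, reductions like the median become a single call along `axis=0`.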
166 | 160 | " signed_items.append(planetary_computer.sign(item).to_dict())" |
167 | 161 | ] |
168 | 162 | }, |
169 | | - { |
170 | | - "cell_type": "markdown", |
171 | | - "metadata": {}, |
172 | | - "source": [ |
173 | | - "Next, reduce the time series of images down to a single image:" |
174 | | - ] |
175 | | - }, |
176 | 163 | { |
177 | 164 | "cell_type": "code", |
178 | 165 | "execution_count": 6, |
|
1484 | 1471 | "cell_type": "markdown", |
1485 | 1472 | "metadata": {}, |
1486 | 1473 | "source": [ |
1487 | | - "Since the data matching your query isn't too large, you can persist it in distributed memory. Once it is stored in memory, subsequent operations will be much faster." |
| 1474 | + "Since the data matching our query isn't too large, we can persist it in distributed memory. Once in memory, subsequent operations will be much faster." |
1488 | 1475 | ] |
1489 | 1476 | }, |
1490 | 1477 | { |
|
1502 | 1489 | "source": [ |
1503 | 1490 | "### Median composite\n", |
1504 | 1491 | "\n", |
1505 | | - "Using regular xarray operations, you can [compute the median](http://xarray.pydata.org/en/stable/generated/xarray.DataArray.median.html) over the time dimension. Under the assumption that clouds are transient, the composite shouldn't contain (many) clouds, since clouds shouldn't be the median pixel value at that point over many images.\n", |
| 1492 | + "Using normal xarray operations, we can [compute the median](http://xarray.pydata.org/en/stable/generated/xarray.DataArray.median.html) over the time dimension. Under the assumption that clouds are transient, the composite shouldn't contain (many) clouds, since they shouldn't be the median pixel value at that point over many images.\n", |
1506 | 1493 | "\n", |
1507 | 1494 | "This will be computed in parallel on the cluster (make sure to open the Dask Dashboard using the link printed out above)." |
1508 | 1495 | ] |
|
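A tiny NumPy sketch of why the median rejects clouds (the reflectance values below are made up): at a single pixel observed on several dates, cloudy readings are bright outliers, and the median ignores them.

```python
import numpy as np

# Toy demonstration: one pixel observed on 7 dates, two of them cloudy
# (very bright). The median picks a typical clear-sky value instead.
reflectance = np.array([0.08, 0.09, 0.95, 0.10, 0.88, 0.09, 0.11])

composite_value = np.median(reflectance)
print(composite_value)  # 0.1 — the cloudy outliers don't survive
```

The real computation is the same idea applied per pixel and per band along the stack's time axis.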
1520 | 1507 | "cell_type": "markdown", |
1521 | 1508 | "metadata": {}, |
1522 | 1509 | "source": [ |
1523 | | - "Use Xarray-Spatial's `true_color` method to visualize your data by converting it to red/green/blue values." |
| 1510 | + "To visualize the data, we'll use xarray-spatial's `true_color` method to convert to red/green/blue values." |
1524 | 1511 | ] |
1525 | 1512 | }, |
1526 | 1513 | { |
|
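For intuition, converting reflectance bands to a displayable image boils down to scaling each band into 8-bit values and stacking them as RGB. This is a simplified sketch: xarray-spatial's `true_color` applies a fancier contrast stretch, and the `vmax` value here is an arbitrary assumption.

```python
import numpy as np

# Simplified sketch of reflectance -> displayable 8-bit RGB.
# true_color in xarray-spatial does a proper contrast stretch;
# here we just clip and scale linearly (vmax is an assumption).
def to_uint8(band, vmax=0.3):
    return (np.clip(band / vmax, 0, 1) * 255).astype("uint8")

red = np.full((2, 2), 0.15)
green = np.full((2, 2), 0.30)
blue = np.full((2, 2), 0.60)  # over vmax, so it clips to 255

rgb = np.dstack([to_uint8(red), to_uint8(green), to_uint8(blue)])
print(rgb.shape, rgb.max())  # (2, 2, 3) 255
```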
1565 | 1552 | "source": [ |
1566 | 1553 | "### Monthly composite\n", |
1567 | 1554 | "\n", |
1568 | | - "Now suppose you don't want to combine images from different parts of the year (for example, you might not want to combine images from January that often include snow with images from July). Again using standard xarray syntax, you can create sets of per-month composites by grouping by month and then computing the median:" |
| 1555 | + "Now suppose we don't want to combine images from different parts of the year (for example, we might not want to combine images from January that often include snow with images from July). Again using standard xarray syntax, we can create a set of per-month composites by grouping by month and then taking the median." |
1569 | 1556 | ] |
1570 | 1557 | }, |
1571 | 1558 | { |
|
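The grouping idea can be sketched without xarray: bucket the per-date values by month, then take a median within each bucket, which is what `groupby("time.month").median()` does along the time axis. The values below are made up (a snowy January reading and a cloudy July one):

```python
import numpy as np
from collections import defaultdict

# Toy sketch of a per-month composite: group scene values by month,
# then take the median within each group.
months = np.array([1, 1, 1, 7, 7, 7])
values = np.array([0.50, 0.55, 0.90, 0.10, 0.12, 0.80])

groups = defaultdict(list)
for m, v in zip(months, values):
    groups[int(m)].append(float(v))

monthly = {m: float(np.median(vs)) for m, vs in sorted(groups.items())}
print(monthly)  # {1: 0.55, 7: 0.12}
```

Keeping January and July in separate groups means a snowy winter scene never contaminates the summer composite, and vice versa.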
1581 | 1568 | "cell_type": "markdown", |
1582 | 1569 | "metadata": {}, |
1583 | 1570 | "source": [ |
1584 | | - "Convert each of those arrays to a true-color image and plot the results as a grid:" |
| 1571 | + "Let's convert each of those arrays to a true-color image and plot the results as a grid." |
1585 | 1572 | ] |
1586 | 1573 | }, |
1587 | 1574 | { |
|
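The grid-plotting step might look like the following sketch with plain matplotlib. The random arrays stand in for the real monthly composites (xarray can also facet directly via `plot.imshow` with a `col` argument):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt
import numpy as np

# Stand-in data: six random "true-color" images, one per month.
rng = np.random.default_rng(42)
images = [rng.random((16, 16, 3)) for _ in range(6)]

fig, axes = plt.subplots(2, 3, figsize=(9, 6))
for ax, img, month in zip(axes.flat, images, range(1, 7)):
    ax.imshow(img)
    ax.set_title(f"Month {month}")
    ax.axis("off")
fig.tight_layout()
fig.savefig("monthly_mosaics.png")
```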
1617 | 1604 | "cell_type": "markdown", |
1618 | 1605 | "metadata": {}, |
1619 | 1606 | "source": [ |
1620 | | - "### Next steps\n", |
1621 | | - "\n", |
1622 | | - "To learn more about using the Planetary Computer's STAC API, see [Reading data from the STAC API](../quickstarts/reading-stac.ipynb). To learn more about Dask, see [Scaling with Dask](../quickstarts/scale-with-dask.ipynb).\n", |
| 1607 | + "### Learn more\n", |
1623 | 1608 | "\n", |
1624 | | - "Click on this link to go to the next notebook: [04 Geospatial Classification](04_Geospatial_Classification.ipynb)" |
| 1609 | + "To learn more about using the Planetary Computer's STAC API, see [Reading data from the STAC API](../quickstarts/reading-stac.ipynb). To learn more about Dask, see [Scaling with Dask](../quickstarts/scale-with-dask.ipynb)." |
1625 | 1610 | ] |
1626 | 1611 | } |
1627 | 1612 | ], |
1628 | 1613 | "metadata": { |
1629 | 1614 | "kernelspec": { |
1630 | | - "display_name": "Python 3 (ipykernel)", |
| 1615 | + "display_name": "Python 3", |
1631 | 1616 | "language": "python", |
1632 | 1617 | "name": "python3" |
1633 | 1618 | }, |
|
1641 | 1626 | "name": "python", |
1642 | 1627 | "nbconvert_exporter": "python", |
1643 | 1628 | "pygments_lexer": "ipython3", |
1644 | | - "version": "3.9.1" |
| 1629 | + "version": "3.8.8" |
1645 | 1630 | }, |
1646 | 1631 | "widgets": { |
1647 | 1632 | "application/vnd.jupyter.widget-state+json": { |
|