0.4.0 Docs Cleanup - 6 #522

Merged
6 changes: 3 additions & 3 deletions README.md
@@ -93,11 +93,11 @@ as a [cluster library](https://docs.databricks.com/libraries/cluster-libraries.h
%pip install databricks-mosaic
```

-Then enable it with
+Then enable Mosaic (and namespace it in python) with

 ```python
-from mosaic import enable_mosaic
-enable_mosaic(spark, dbutils)
+import mosaic as mos
+mos.enable_mosaic(spark, dbutils)
 ```

### Scala
66 changes: 7 additions & 59 deletions docs/source/api/spatial-functions.rst
@@ -238,58 +238,6 @@ st_bufferloop

Fig 1. ST_BufferLoop(wkt, 0.02, 0.04)

st_centroid2D [Deprecated]
**************************

.. function:: st_centroid2D(col)

Returns the x and y coordinates representing the centroid of the input geometry.

:param col: Geometry
:type col: Column
:rtype: Column: StructType[x: DoubleType, y: DoubleType]

:example:

.. tabs::
.. code-tab:: py

df = spark.createDataFrame([{'wkt': 'POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))'}])
df.select(st_centroid2D('wkt')).show()
+---------------------------------------+
|st_centroid(wkt) |
+---------------------------------------+
|{25.454545454545453, 26.96969696969697}|
+---------------------------------------+

.. code-tab:: scala

val df = List(("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))")).toDF("wkt")
df.select(st_centroid2D(col("wkt"))).show()
+---------------------------------------+
|st_centroid(wkt) |
+---------------------------------------+
|{25.454545454545453, 26.96969696969697}|
+---------------------------------------+

.. code-tab:: sql

SELECT st_centroid2D("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))")
+---------------------------------------+
|st_centroid(wkt) |
+---------------------------------------+
|{25.454545454545453, 26.96969696969697}|
+---------------------------------------+

.. code-tab:: r R

df <- createDataFrame(data.frame(wkt = "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"))
showDF(select(df, st_centroid2D(column("wkt"))), truncate=F)
+---------------------------------------+
|st_centroid(wkt) |
+---------------------------------------+
|{25.454545454545453, 26.96969696969697}|
+---------------------------------------+

st_centroid
*************
@@ -533,8 +481,8 @@ st_dimension
.. tabs::
.. code-tab:: py

->>> df = spark.createDataFrame([{'wkt': 'POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))'}])
->>> df.select(st_dimension('wkt')).show()
+df = spark.createDataFrame([{'wkt': 'POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))'}])
+df.select(st_dimension('wkt')).show()
+-----------------+
|st_dimension(wkt)|
+-----------------+
@@ -543,8 +491,8 @@ st_dimension

.. code-tab:: scala

->>> val df = List("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))").toDF("wkt")
->>> df.select(st_dimension(col("wkt"))).show()
+val df = List("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))").toDF("wkt")
+df.select(st_dimension(col("wkt"))).show()
+-----------------+
|st_dimension(wkt)|
+-----------------+
@@ -553,7 +501,7 @@ st_dimension

.. code-tab:: sql

->>> SELECT st_dimension("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))")
+SELECT st_dimension("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))")
+-----------------+
|st_dimension(wkt)|
+-----------------+
@@ -562,8 +510,8 @@ st_dimension

.. code-tab:: r R

->>> df <- createDataFrame(data.frame(wkt = "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"))
->>> showDF(select(df, st_dimension(column("wkt"))))
+df <- createDataFrame(data.frame(wkt = "POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"))
+showDF(select(df, st_dimension(column("wkt"))))
+-----------------+
|st_dimension(wkt)|
+-----------------+
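The tabs above all return 2 because st_dimension reports the topological dimension of the input: 0 for points, 1 for linestrings, 2 for polygons. A toy WKT-prefix lookup illustrating that mapping — an illustration only, not how Mosaic actually computes it:

```python
# Topological dimension by geometry type (illustrative, not Mosaic's implementation)
_DIMENSIONS = {
    "POINT": 0, "MULTIPOINT": 0,
    "LINESTRING": 1, "MULTILINESTRING": 1,
    "POLYGON": 2, "MULTIPOLYGON": 2,
}

def wkt_dimension(wkt: str) -> int:
    # The geometry type is everything before the first parenthesis
    geom_type = wkt.strip().split("(", 1)[0].strip().upper()
    return _DIMENSIONS[geom_type]

print(wkt_dimension("POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))"))  # → 2
```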
41 changes: 0 additions & 41 deletions docs/source/api/spatial-indexing.rst
@@ -1488,44 +1488,3 @@ grid_geometrykloopexplode
.. raw:: html

</div>



mosaic_explode [Deprecated]
***************************

.. function:: mosaic_explode(geometry, resolution, keep_core_geometries)

This is an alias for :ref:`grid_tessellateexplode`


mosaicfill [Deprecated]
************************

.. function:: mosaicfill(geometry, resolution, keep_core_geometries)

This is an alias for :ref:`grid_tessellate`


point_index_geom [Deprecated]
******************************

.. function:: point_index_geom(point, resolution)

This is an alias for :ref:`grid_pointascellid`


point_index_lonlat [Deprecated]
********************************

.. function:: point_index_lonlat(point, resolution)

This is an alias for :ref:`grid_longlatascellid`


polyfill [Deprecated]
**********************

.. function:: polyfill(geom, resolution)

This is an alias for :ref:`grid_polyfill`
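Each of the five deprecated aliases above has a one-to-one replacement, so migrating old code can be mechanical. A hypothetical helper capturing the mapping listed on this page (the helper itself is not part of Mosaic, and the text replacement is deliberately naive):

```python
# Deprecated name -> current grid_ function, per the aliases listed above
DEPRECATED_ALIASES = {
    "mosaic_explode": "grid_tessellateexplode",
    "mosaicfill": "grid_tessellate",
    "point_index_geom": "grid_pointascellid",
    "point_index_lonlat": "grid_longlatascellid",
    "polyfill": "grid_polyfill",
}

def modernize(sql: str) -> str:
    """Rewrite deprecated Mosaic function names in a SQL string (naive replace)."""
    for old, new in DEPRECATED_ALIASES.items():
        sql = sql.replace(old, new)
    return sql

print(modernize("SELECT polyfill(geom, 9) FROM tbl"))
# → SELECT grid_polyfill(geom, 9) FROM tbl
```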
Binary file added docs/source/images/function_describe.png
Binary file added docs/source/images/functions_show.png
23 changes: 19 additions & 4 deletions docs/source/usage/automatic-sql-registration.rst
@@ -34,7 +34,8 @@ To install Mosaic on your Databricks cluster, take the following steps:

#. Upload Mosaic jar to a dedicated fuse mount location. E.g. "dbfs:/FileStore/mosaic/jars/".

-#. Create an init script that fetches the mosaic jar and copies it to databricks/jars.
+#. Create an init script that fetches the mosaic jar and copies it to "databricks/jars".

You can also use the output from (0.4 series) python function :code:`setup_fuse_install`, e.g.
:code:`setup_fuse_install(<to_fuse_dir>, jar_copy=True)`, which can help copy the JAR used in
the init script below.
@@ -73,17 +74,31 @@ To install Mosaic on your Databricks cluster, take the following steps:
Testing
*******

-To test the installation, create a new Python notebook and run the following command:
+To test the installation, create a new Python notebook and run the following commands (similar for :code:`grid_` and :code:`rst_`, not shown):

.. code-block:: python

-spark.sql("""show functions""").where("startswith(function, 'st_')").display()
+sql("""SHOW FUNCTIONS""").where("startswith(function, 'st_')").display()

You should see all the supported :code:`ST_` functions registered by Mosaic appear in the output.

You should see all the supported functions registered by Mosaic appear in the output.
.. figure:: ../images/functions_show.png
:figclass: doc-figure

Fig 1. Show Functions Example

.. note::
You may see some :code:`ST_` functions from other libraries, so pay close attention to the provider.
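The check above simply filters the registered function names by prefix. The same logic sketched off-cluster in plain Python — the function list here is an assumed example, with a hypothetical non-Mosaic `st_distance` standing in for the note's provider caveat:

```python
# Assumed example output of SHOW FUNCTIONS; providers are hypothetical
registered = [
    ("st_area", "Mosaic"),
    ("st_buffer", "Mosaic"),
    ("st_centroid", "Mosaic"),
    ("grid_tessellate", "Mosaic"),
    ("st_distance", "SomeOtherLibrary"),  # another library's ST_ function
]

# Same filter as startswith(function, 'st_'), keeping the provider visible
st_functions = [(name, provider) for name, provider in registered
                if name.startswith("st_")]
for name, provider in st_functions:
    print(f"{name}  ({provider})")
```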

.. code-block:: python

sql("""DESCRIBE FUNCTION st_buffer""")

.. figure:: ../images/function_describe.png
:figclass: doc-figure

Fig 2. Describe Function Example

.. warning::
Mosaic 0.4.x SQL bindings for DBR 13 can register with Assigned clusters, but not Shared Access due to API changes,
more `here <https://docs.databricks.com/en/udf/index.html>`_.
4 changes: 2 additions & 2 deletions docs/source/usage/install-gdal.rst
@@ -107,5 +107,5 @@ code at the top of the notebook:
GDAL 3.4.1, released 2021/12/27
.. note::
-You can configure init script from default ubuntu GDAL (3.4.1) to `ubuntugis ppa <https://launchpad.net/~ubuntugis/+archive/ubuntu/ppa>`_ (3.4.3)
-with :code:`setup_gdal(with_ubuntugis=True)`
+You can configure init script from default ubuntu GDAL (3.4.1) to ubuntugis ppa @ https://launchpad.net/~ubuntugis/+archive/ubuntu/ppa (3.4.3)
+with `setup_gdal(with_ubuntugis=True)`
2 changes: 1 addition & 1 deletion docs/source/usage/installation.rst
@@ -54,7 +54,7 @@ or from within a Databricks notebook using the :code:`%pip` magic command, e.g.
If you need to install Mosaic 0.3 series for DBR 12.2 LTS, e.g.

-.. code-block::bash
+.. code-block:: bash
%pip install "databricks-mosaic<0.4,>=0.3"