Change type returned by ST_X and ST_Y to Double #324

Merged
6 changes: 6 additions & 0 deletions clouds/databricks/CHANGELOG.md
@@ -6,6 +6,12 @@ All notable changes to this project will be documented in this file.

The format is inspired by [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

## [2022.09.21] - 2022-09-21

### Accessors
#### Changed
- Change type returned by ST_X and ST_Y to Double

## [2022.09.20] - 2022-09-20

### All modules
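Note: a minimal sketch of what the type change in this entry means for callers. It is illustrative only and not part of the PR; it assumes a `SparkSession` with the toolbox UDFs registered, as `HiveTestEnvironment` provides for `ssc` in the spec added further down.

```scala
import org.apache.spark.sql.SparkSession

object CoordReadSketch {
  // Hypothetical sketch: ST_X/ST_Y now surface java.lang.Double,
  // so the coordinate columns are read with getDouble instead of a Float-typed read.
  def readPointCoords(spark: SparkSession): (Double, Double) = {
    val row = spark
      .sql("SELECT ST_X(ST_POINT(1.5, 2.5)) AS x, ST_Y(ST_POINT(1.5, 2.5)) AS y")
      .first()
    (row.getDouble(0), row.getDouble(1)) // 1.5 and 2.5
  }
}
```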
6 changes: 3 additions & 3 deletions clouds/databricks/libraries/scala/Makefile
@@ -7,7 +7,7 @@ PRODUCT ?= core
COMMON_DIR = $(ROOT_DIR)/../../common
SCALA_DIR ?= $(ROOT_DIR)
JAR_DIR ?= $(SCALA_DIR)/$(PRODUCT)/target/scala-2.12
-JAR_DEPLOY_PATH = dbfs:/FileStore/jars-carto/$(DB_PREFIX)carto
+JAR_DEPLOY_PATH = dbfs:/FileStore/jars-carto/$(DB_PREFIX)carto-$(PRODUCT)
SQL_PATH = $(JAR_DIR)/classes/sql/createUDFs.sql

include $(COMMON_DIR)/Makefile
@@ -86,8 +86,8 @@ test:

remove: check
echo "Removing libraries..."
-databricks libraries uninstall --cluster-id $(DB_CLUSTER_ID) --jar $(JAR_DEPLOY_PATH)/analyticstoolbox-$(PRODUCT)-assembly-SNAPSHOT.jar
-dbfs rm $(JAR_DEPLOY_PATH)/analyticstoolbox-$(PRODUCT)-assembly.jar
+databricks libraries uninstall --cluster-id $(DB_CLUSTER_ID) --jar $(JAR_DEPLOY_PATH)/analyticstoolbox-$(PRODUCT)-assembly.jar
+dbfs rm -r $(JAR_DEPLOY_PATH)/

clean:
echo "Cleaning libraries..."
ST_X.scala
@@ -16,13 +16,15 @@

package com.carto.analyticstoolbox.modules.accessors

-import com.carto.analyticstoolbox.modules._
 import com.azavea.hiveless.HUDF
-import org.locationtech.geomesa.spark.jts.udf.GeometricAccessorFunctions
-import org.locationtech.jts.geom.Geometry
+import com.carto.analyticstoolbox.modules._
+import org.locationtech.jts.geom.{Geometry, Point}

-import java.lang
+import java.{lang => jl}

-class ST_X extends HUDF[Geometry, lang.Float] {
-  def function: Geometry => lang.Float = GeometricAccessorFunctions.ST_X
+class ST_X extends HUDF[Geometry, jl.Double] {
+  def function: Geometry => jl.Double = {
+    case geom: Point => geom.getX
+    case _           => null
+  }
 }
ST_Y.scala
@@ -16,13 +16,15 @@

package com.carto.analyticstoolbox.modules.accessors

-import com.carto.analyticstoolbox.modules._
 import com.azavea.hiveless.HUDF
-import org.locationtech.geomesa.spark.jts.udf.GeometricAccessorFunctions
-import org.locationtech.jts.geom.Geometry
+import com.carto.analyticstoolbox.modules._
+import org.locationtech.jts.geom.{Geometry, Point}

 import java.{lang => jl}

-class ST_Y extends HUDF[Geometry, jl.Float] {
-  def function: Geometry => jl.Float = GeometricAccessorFunctions.ST_Y
+class ST_Y extends HUDF[Geometry, jl.Double] {
+  def function: Geometry => jl.Double = {
+    case geom: Point => geom.getY
+    case _           => null
+  }
 }
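The updated accessors can also be exercised directly, outside Hive, in the same way the new ST_MakePointTest below calls `function`. A small sketch follows; the `GeometryFactory`/`Coordinate` setup and the wrapping object are illustrative assumptions, not code from this PR.

```scala
import com.carto.analyticstoolbox.modules.accessors.{ST_X, ST_Y}
import org.locationtech.jts.geom.{Coordinate, GeometryFactory}

object AccessorSketch extends App {
  val gf    = new GeometryFactory()
  val point = gf.createPoint(new Coordinate(1.5, 2.5))
  val line  = gf.createLineString(Array(new Coordinate(0, 0), new Coordinate(1, 1)))

  // The pattern-matching functions above box the coordinate as java.lang.Double
  // and fall back to null for anything that is not a Point.
  println(new ST_X().function(point)) // 1.5
  println(new ST_Y().function(point)) // 2.5
  println(new ST_X().function(line))  // null
}
```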
STConstructorsSpec.scala (new file)
@@ -0,0 +1,36 @@
/*
* Copyright 2022 Azavea
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.carto.analyticstoolbox.modules.constructors

import com.carto.analyticstoolbox.{HiveTestEnvironment, TestTables}
import org.apache.spark.sql.Row
import org.scalatest.funspec.AnyFunSpec

class STConstructorsSpec extends AnyFunSpec with HiveTestEnvironment with TestTables {
describe("ST constructors functions spec") {
it("ST_POINT can be created from the result of ST_X and ST_Y") {
val df = ssc.sql(
"""WITH t AS(
| SELECT ST_POINT(1.5, 2.5) as point
|)
|SELECT ST_ASTEXT(ST_POINT(ST_X(point), ST_Y(point))) FROM t""".stripMargin
)
val result: Array[Row] = df.take(1)
result.head.get(0) shouldEqual "POINT (1.5 2.5)"
}
}
}
ST_MakePointTest.scala (new file)
@@ -0,0 +1,14 @@
package com.carto.analyticstoolbox.modules.constructors

import org.scalatest.funspec.AnyFunSpec
import org.scalatest.matchers.should.Matchers.convertToAnyShouldWrapper

class ST_MakePointTest extends AnyFunSpec {
describe("ST_MakePoint") {
it("Should create a point with coordinates correctly") {
val point = new ST_MakePoint().function(2, 4)
point.getX shouldEqual 2
point.getY shouldEqual 4
}
}
}
2 changes: 1 addition & 1 deletion clouds/databricks/modules/doc/accessors/ST_X.md
@@ -12,7 +12,7 @@ If _geom_ is a `Point`, return the X coordinate of that point.

**Return type**

-`Float`
+`Double`

**Example**

2 changes: 1 addition & 1 deletion clouds/databricks/modules/doc/accessors/ST_Y.md
@@ -12,7 +12,7 @@ If _geom_ is a `Point`, return the Y coordinate of that point.

**Return type**

-`Float`
+`Double`

**Example**
