
I am "translating" some Postgis code to Geomesa and I have some Postgis code like this:

select ST_Transform(ST_SetSRID(ST_Point(longitude, latitude), 4326), 27700)

which converts a point geometry from 4326 to 27700 for example.

In the GeoMesa Spark SQL documentation (https://www.geomesa.org/documentation/user/spark/sparksql_functions.html) I can see ST_Point, but I cannot find an equivalent ST_Transform function. Any ideas?

Randomize

3 Answers


I have used the Sedona library for geoprocessing, and it has an ST_Transform function that worked fine for me, so you could use that. See the official documentation: https://sedona.apache.org/api/sql/GeoSparkSQL-Function/#st_transform
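For reference, here is a minimal sketch of how that could look, assuming an existing SparkSession named spark and Apache Sedona on the classpath; the points_wgs84 table and the longitude/latitude columns are placeholders, and the registration class name may differ in older GeoSpark releases:

import org.apache.sedona.sql.utils.SedonaSQLRegistrator

// Register Sedona's SQL functions (ST_Point, ST_Transform, ...) on the session
SedonaSQLRegistrator.registerAll(spark)

// Reproject from WGS84 to British National Grid; the CAST to Decimal follows the
// linked docs (newer Sedona versions also accept plain doubles), and older releases
// may expect lat/lon axis order, so check against your version
val reprojected = spark.sql("""
  SELECT ST_Transform(
           ST_Point(CAST(longitude AS Decimal(24, 20)), CAST(latitude AS Decimal(24, 20))),
           'epsg:4326', 'epsg:27700') AS geom_27700
  FROM points_wgs84
""")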

GeoMesa now supports the function as well: https://www.geomesa.org/documentation/3.1.2/user/spark/sparksql_functions.html#st-transform
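A similar sketch of the GeoMesa route, assuming GeoMesa 3.x's geomesa-spark-sql module on the classpath and an existing SparkSession spark; the SQLTypes.init registration call and the points table are assumptions that may vary by version:

import org.apache.spark.sql.SQLTypes

// Register GeoMesa's geometry types and ST_* SQL functions with the session
SQLTypes.init(spark.sqlContext)

// GeoMesa's ST_Transform takes the source and target CRS codes as strings
val bng = spark.sql("""
  SELECT ST_Transform(ST_Point(longitude, latitude), 'EPSG:4326', 'EPSG:27700') AS geom_27700
  FROM points
""")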

yogesh garud

For GeoMesa 1.x, 2.x, and the upcoming 3.0 release, there is no ST_Transform at present. You could write your own UDF using GeoTools (or another library) to do the transformation.

Admittedly, this would require some work.
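For what it's worth, here is a minimal sketch of such a UDF, assuming an existing SparkSession spark, with geomesa-spark-jts (for the JTS geometry type support) and GeoTools on the classpath; the hard-coded EPSG codes and the geom column are just for illustration:

import org.apache.spark.sql.functions.{col, udf}
import org.geotools.geometry.jts.JTS
import org.geotools.referencing.CRS
import org.locationtech.geomesa.spark.jts._
import org.locationtech.jts.geom.Geometry

// The JTS geometry types must be registered before defining the UDF
spark.withJTS

// Reproject a geometry from EPSG:4326 (lon/lat) to EPSG:27700 using GeoTools;
// for simplicity the CRS lookup happens inside the UDF, but for real workloads
// you would want to cache the MathTransform
val transformTo27700 = udf { geom: Geometry =>
  val source = CRS.decode("EPSG:4326", true) // true = force lon/lat axis order
  val target = CRS.decode("EPSG:27700", true)
  JTS.transform(geom, CRS.findMathTransform(source, target, true))
}

// Usage on a DataFrame that already has a JTS geometry column named "geom":
// df.withColumn("geom_27700", transformTo27700(col("geom")))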

GeoJim

I recently ran into the same issue on Azure Databricks. I was able to solve it by manually installing the JAR library from here and then running the following Scala code:


%scala

import org.locationtech.jts.geom._
import org.locationtech.geomesa.spark.jts._
import org.locationtech.geomesa.spark.geotools._

import org.apache.spark.sql.types._
import org.apache.spark.sql.functions._
import spark.implicits._

// Register the JTS geometry types and spatial functions with the Spark session
spark.withJTS

// data_points is an existing DataFrame with LONGITUDE and LATITUDE columns
val transformedPoints = data_points
  .withColumn("geom", st_makePoint(col("LONGITUDE"), col("LATITUDE")))
  .withColumn("geom_5347", st_transform(col("geom"), lit("EPSG:4326"), lit("EPSG:5347")))

display(transformedPoints)

Good luck.

andrés ab