
I want to use this method from Databricks:

from_avro($"value", "t-value", schemaRegistryAddr).as("value")

It allows you to simply pass the schema registry URL, and it will find the right schema and parse the payload.

The problem is that this library is only available in the Databricks runtime (we are using Databricks, but this method is documented for notebooks), so I cannot compile a solution that uses it (or I don't know how). My idea was to implement a facade object in a different project and include it as a provided dependency:

object functions {
  def from_avro(data: Column, subject: String, schemaRegistryAddress: String): Column = {
    null
  }
}

But when I try to use it in my Spark code, the IDE complains with 'cannot resolve overloaded method from_avro', because both methods (the one I implemented and the one in the public Spark library, which has a different signature) live at the same path, 'org.apache.spark.sql.avro.functions'. Is there a way to use that method? I don't know whether there is a technical solution for this.
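One workaround to sketch (not official Databricks guidance): keep the facade in your own package so nothing collides with the public `org.apache.spark.sql.avro.functions` at compile time, and resolve the Databricks implementation by reflection at runtime on the cluster. The `callModuleMethod` helper below is hypothetical; the target class and method names are taken from the question, and the call will only succeed where the Databricks runtime provides that overload. `AnyRef` stands in for `Column` to keep the sketch self-contained.

```scala
// Hypothetical facade living in your own package, e.g. com.example.avro,
// so it never clashes with org.apache.spark.sql.avro.functions.
object DatabricksAvro {

  // Generic helper: invoke a method on a Scala `object` (module) by name.
  // A Scala object `X` compiles to class `X$` with a static MODULE$ field.
  // Overload selection here is naive (first method with a matching name).
  def callModuleMethod(moduleClassName: String, method: String, args: AnyRef*): AnyRef = {
    val clazz  = Class.forName(moduleClassName + "$")
    val module = clazz.getField("MODULE$").get(null)
    val m = clazz.getMethods.find(_.getName == method)
      .getOrElse(throw new NoSuchMethodException(s"$moduleClassName.$method"))
    m.invoke(module, args: _*)
  }

  // Facade for the Databricks-only overload, resolved at runtime.
  // In real code `data` and the result would be org.apache.spark.sql.Column;
  // this will throw ClassNotFoundException outside the Databricks runtime.
  def from_avro(data: AnyRef, subject: String, schemaRegistryAddress: String): AnyRef =
    callModuleMethod("org.apache.spark.sql.avro.functions", "from_avro",
      data, subject, schemaRegistryAddress)
}
```

Because the facade compiles against no Spark or Databricks classes, your project builds anywhere; the cost is that a wrong method name or signature only fails at runtime instead of at compile time.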

MrElephant
  • Does this answer your question? [Unable to find Databricks spark sql avro shaded jars in any public maven repository](https://stackoverflow.com/questions/71069226/unable-to-find-databricks-spark-sql-avro-shaded-jars-in-any-public-maven-reposit) – Alex Ott Jul 31 '22 at 11:03
  • It seems there is no public jar supporting this method, so using this method is not possible – MrElephant Sep 12 '22 at 11:49

0 Answers