
I have the schema associated with a table to be created fetched from confluent schema-registry in below code:

private val avroSchema = schemaRegistryClient.getLatestSchemaMetadata("topicName").getSchema
// toSqlType returns a SchemaType; the underlying StructType is in its dataType field
private val sparkSchema = SchemaConverters
  .toSqlType(new Schema.Parser().parse(avroSchema))
  .dataType.asInstanceOf[StructType]

Now I'm trying to define a Delta Lake table whose structure is based on this schema, but I'm not sure how to go about it. Any help appreciated.

Jacek Laskowski
Vikas J

1 Answer


In Scala you can do the following.

For defining the schema:

val customSchema = StructType(
  Array(
    StructField("col1", StringType, true),
    StructField("col2", StringType, true),
    StructField("col3", StringType, true)
  )
)

For reading the data with that schema:

val DF = spark.read.format("csv")
  .option("delimiter", "\t") // use a proper delimiter
  .schema(customSchema)
  .load("path")

While writing the table to a particular location, you can specify .format("delta"):
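A minimal sketch of that write step, assuming the DataFrame DF from above (the output path is a placeholder):

```scala
// Write DF out in Delta format; "path/to/delta-table" is a placeholder.
// .mode("overwrite") replaces any existing data at that location.
DF.write
  .format("delta")
  .mode("overwrite")
  .save("path/to/delta-table")
```

In your case you could pass the registry-derived sparkSchema to .schema(...) instead of a hand-written customSchema, since both are StructType values.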

darekarsam