
I'm running a Java Dataflow job to read Avro files and have been getting errors. Looking for help on this -

Here is the code -

// Get Avro Schema
String schemaJson = getSchema(options.getAvroSchema());
Schema schema = new Schema.Parser().parse(schemaJson);

// Check schema field types before starting the Dataflow job
checkFieldTypes(schema);

// Create the Pipeline object with the options we defined above.
Pipeline pipeline = Pipeline.create(options);
String bqStr = getBQString(options);
// TableSchema ts = BigQueryAvroUtils.getTableSchema(User.SCHEMA$);
// Convert Avro To CSV
PCollection<GenericRecord> records =
    pipeline.apply(
        "Read Avro files",
        AvroIO.readGenericRecords(schema)
            .from(options.getInputFile()));

records
    .apply(
        "Convert Avro to CSV formatted data",
        ParDo.of(new ConvertAvroToCsv(schemaJson, options.getCsvDelimiter())))
    .apply(
        "Write CSV formatted data",
        TextIO.write().to(options.getOutput())
            .withSuffix(".csv"));

records.apply(
      "Write to BigQuery",
      BigQueryIO.write()
          .to(bqStr)
          .withJsonSchema(schemaJson)
          .withWriteDisposition(WRITE_APPEND)
          .withCreateDisposition(CREATE_IF_NEEDED)
          .withFormatFunction(TABLE_ROW_PARSER));
  // [END bq_write]

Here is the error that I see -

2020-06-01 13:14:41 ERROR MonitoringUtil$LoggingHandler:99 - 2020-06-01T07:44:39.240Z: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to org.apache.avro.specific.SpecificRecord
        at com.example.AvroToCsv$1.apply(AvroToCsv.java:1)
        at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:76)

The same ClassCastException repeats on each subsequent retry (at 13:14:52, 13:15:03, and 13:15:15).

1 Answer


The error is in your TABLE_ROW_PARSER function. It appears to be casting an Avro GenericRecord to a SpecificRecord. Since you read with AvroIO.readGenericRecords(schema), every element in the PCollection is a GenericData.Record, which does not implement SpecificRecord, so the cast fails at runtime.
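To see the mechanism in isolation, here is a self-contained sketch. The interfaces below are stand-ins mirroring Avro's class hierarchy, not the real Avro types: both generic and generated (specific) records implement a common record interface, but a generic record is never a subtype of SpecificRecord, so the downcast throws.

```java
// Stand-in hierarchy (hypothetical names) mirroring Avro's:
// GenericData.Record and generated SpecificRecord classes both implement
// IndexedRecord, but neither is a subtype of the other.
public class CastDemo {
    interface IndexedRecord {}                        // stand-in for org.apache.avro.generic.IndexedRecord
    interface SpecificRecord extends IndexedRecord {} // stand-in for org.apache.avro.specific.SpecificRecord
    static class GenericRecordImpl implements IndexedRecord {} // stand-in for GenericData.Record

    static String tryCast() {
        IndexedRecord r = new GenericRecordImpl(); // what AvroIO.readGenericRecords emits
        try {
            SpecificRecord s = (SpecificRecord) r; // the cast TABLE_ROW_PARSER is effectively doing
            return "ok";
        } catch (ClassCastException e) {
            return "ClassCastException"; // what appears in the Dataflow worker logs
        }
    }

    public static void main(String[] args) {
        System.out.println(tryCast()); // prints "ClassCastException"
    }
}
```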

The line in PrepareWrite that is failing is PrepareWrite.java:76, visible in your stack trace. That line calls the format function you provide. The format function must convert each input element into a JSON TableRow. It is probably better to use withAvroFormatFunction for efficiency.
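A GenericRecord-safe format function might look like the sketch below (an illustration, not your actual parser). It copies every top-level field into a TableRow by name; converting all values with toString() is a simplification, since nested records, bytes, and logical types would need real handling.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.transforms.SerializableFunction;

// Sketch of a format function that works on GenericRecord directly,
// with no cast to SpecificRecord:
static final SerializableFunction<GenericRecord, TableRow> TABLE_ROW_PARSER =
    record -> {
        TableRow row = new TableRow();
        for (Schema.Field f : record.getSchema().getFields()) {
            Object value = record.get(f.name());
            // toString() is a simplification; real code should map Avro
            // types (bytes, logical types, nested records) explicitly.
            row.set(f.name(), value == null ? null : value.toString());
        }
        return row;
    };

// Wired into the existing write, with the element type made explicit:
// BigQueryIO.<GenericRecord>write()
//     .to(bqStr)
//     .withJsonSchema(schemaJson)
//     .withWriteDisposition(WRITE_APPEND)
//     .withCreateDisposition(CREATE_IF_NEEDED)
//     .withFormatFunction(TABLE_ROW_PARSER);
```

Alternatively, withAvroFormatFunction lets BigQueryIO load the Avro data directly, skipping the per-element TableRow conversion entirely.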

Kenn Knowles