
I am using AvroIO from Apache Beam with the Spark Runner. I have defined an Avro record with the following field:

    {
      "name" : "serviceDate",
      "type" : [ "null", {
        "type" : "int",
        "logicalType" : "date"
      } ],
      "doc" : "date",
      "default" : null
    }

Reading this data fails with java.lang.ClassCastException: java.lang.Integer cannot be cast to org.joda.time.LocalDate

I am using Avro 1.8.2.
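
For reference, a minimal sketch of the kind of AvroIO read involved; ServiceRecord stands in for the class generated from the schema above and the input path is a placeholder, so this is illustrative rather than the exact pipeline:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.AvroIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Illustrative only: ServiceRecord is assumed to be the SpecificRecord class
    // compiled from the schema above; the path is a placeholder.
    // args come from main, e.g. --runner=SparkRunner to pick the Spark Runner.
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    p.apply("ReadServiceRecords",
        AvroIO.read(ServiceRecord.class).from("/data/service/*.avro"));
    p.run().waitUntilFinish();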

Anuj J
  • Anuj, can you provide more details on what your pipeline looks like? How do you use AvroIO? – Anton May 14 '19 at 17:08
  • 2
    I found this to be an issue with Avro 1.8.2. Seems they missed implementing the logical types. Hive SerDe overcame this problem by explicitly handling logical types like date, decimal etc. – Anuj J May 27 '19 at 06:06
  • Thanks, I had the same Avro issue. Switching to Avro 1.9.0 solved the problem as Joda time is no longer used ([AVRO-2079](https://issues.apache.org/jira/browse/AVRO-2079)). – oskarryn Dec 29 '22 at 10:12
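
In the spirit of the "explicitly handling logical types" workaround mentioned in the comments, one option is to read the files as GenericRecord so that Avro 1.8.2 never attempts the missing date conversion, and convert the days-since-epoch int yourself. This is only a sketch: SCHEMA_JSON, the input path, and the downstream output are placeholders, not code from the question.

    import java.time.LocalDate;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.AvroIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    // Sketch: read GenericRecords, so the "date" logical type stays a plain
    // Integer (days since the Unix epoch), then convert it explicitly.
    Schema schema = new Schema.Parser().parse(SCHEMA_JSON);  // SCHEMA_JSON: full record schema as a String (placeholder)

    Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
    p.apply("ReadGenericRecords",
            AvroIO.readGenericRecords(schema).from("/data/service/*.avro"))
     .apply("ConvertServiceDate", ParDo.of(new DoFn<GenericRecord, String>() {
       @ProcessElement
       public void processElement(ProcessContext c) {
         GenericRecord record = c.element();
         Integer days = (Integer) record.get("serviceDate");   // nullable union, may be null
         LocalDate serviceDate = (days == null) ? null : LocalDate.ofEpochDay(days);
         c.output(String.valueOf(serviceDate));                // placeholder downstream use
       }
     }));
    p.run().waitUntilFinish();

Alternatively, as the last comment notes, upgrading to Avro 1.9.x avoids the issue entirely, since Joda time is no longer used there.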

0 Answers