I'm using Flink 1.9.0 and I'm unable to import or use the Table API classes.
I've tried adding different SBT dependencies for it.
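For context, my understanding is that the Flink 1.9 Scala Table API needs roughly this set of modules in build.sbt (a sketch of what I've been trying; I may well be mixing versions, which is what the error below seems to hint at):

// build.sbt (sketch): Flink 1.9 Table API modules for Scala.
// All modules pinned to a single version, since a mismatch between
// flink-streaming-scala and the table modules can break class loading.
val flinkVersion = "1.9.0"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-table-api-scala-bridge" % flinkVersion,
  "org.apache.flink" %% "flink-table-planner" % flinkVersion
)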
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.Types
import org.apache.flink.table.api.scala._
import org.apache.flink.table.sources.CsvTableSource

object TemperatureJob {

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tEnv = StreamTableEnvironment.create(env)

    // CSV source matching the file below: skip the header row and
    // strip the quotes around the locationID values
    val tempSource = CsvTableSource.builder()
      .path("/home/amulya/Desktop/csvForBroadcast/CSV.csv")
      .fieldDelimiter(",")
      .ignoreFirstLine()
      .quoteCharacter('"')
      .field("locationID", Types.STRING())
      .field("temp", Types.DOUBLE())
      .build()

    tEnv.registerTableSource("Temperatures", tempSource)

    // Keep only readings of 50 degrees or more
    val askTable = tEnv
      .scan("Temperatures")
      .where('temp >= 50)
      .select('locationID, 'temp)

    tEnv.toAppendStream[Events](askTable).print()

    env.execute()
  }

  // temp is Double to match Types.DOUBLE() in the source schema
  case class Events(locationID: String, temp: Double)
}
I have simple CSV data:
locationID,temp
"1",25
"2",25
"3",35
"4",45
"5",55
This is the error:
Error:scalac: missing or invalid dependency detected while loading class file 'ScalaCaseClassSerializer.class'.
Could not access type SelfResolvingTypeSerializer in object org.apache.flink.api.common.typeutils.TypeSerializerConfigSnapshot,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ScalaCaseClassSerializer.class' was compiled against an incompatible version of org.apache.flink.api.common.typeutils.TypeSerializerConfigSnapshot.
I'm trying to perform CEP on this basic data to get started with Apache Flink. Any help would be highly appreciated.
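For what it's worth, the kind of CEP I'm aiming for once the build compiles is roughly this (a sketch only: eventStream stands for the DataStream[Events] produced by toAppendStream above, the pattern name "hot" is a placeholder, and it would also need the flink-cep-scala dependency):

import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern

// Sketch: match every single reading at or above 50 degrees.
// "hot" is just a placeholder name for the pattern state.
val hotPattern = Pattern.begin[Events]("hot").where(_.temp >= 50)

CEP.pattern(eventStream, hotPattern)
  .select(matches => matches("hot").head) // emit the matched event
  .print()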