
I am writing a Flink CEP program inside a Lagom microservice implementation. My Flink CEP program runs perfectly fine as a simple Scala application, but when I use the same code inside the Lagom service implementation I receive the following exception:

[screenshot of the exception stack trace]

Lagom Service Implementation

override def start = ServiceCall[NotUsed, String] {

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  env.setParallelism(1)

  val executionConfig = env.getConfig
  executionConfig.disableSysoutLogging()

  val topic_name = "topic_test"

  val props = new Properties
  props.put("bootstrap.servers", "localhost:9092")
  props.put("acks", "all")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer")
  props.put("block.on.buffer.full", "false")

  val kafkaSource = new FlinkKafkaConsumer010(topic_name, new KafkaDeserializeSchema, props)

  val stream = env.addSource(kafkaSource)

  val deliveryPattern = Pattern.begin[XYZ]("begin").where(_.ABC == 5)
    .next("next").where(_.ABC == 10)
    .next("end").where(_.ABC == 5)

  val deliveryPatternStream = CEP.pattern(stream, deliveryPattern)

  def selectFn(pattern: collection.mutable.Map[String, XYZ]): String = {
    val startEvent = pattern.get("begin").get
    val nextEvent = pattern.get("next").get
    "Alert Detected"
  }

  val deliveryResult = deliveryPatternStream.select(selectFn(_)).print()

  env.execute("CEP")

  req => Future.successful("Done")
}



}

I don't understand how to resolve this issue.

Madiha Khalid
  • Every field in your object should be serializable (often including constructor arguments). To localize the problem, you might comment out all components and then uncomment parts of them. – Arseniy Zhizhelev Apr 23 '17 at 12:07
  • Yes, for Flink data processing I serialize my class, and the same code works perfectly fine in a simple Scala main() method; however, in the Lagom microservice implementation this exception appears. I think this exception comes from the Lagom framework when integrating Flink's data processing as a web service (in Lagom). – Madiha Khalid Apr 23 '17 at 19:29
  • @ignasi35 can you answer this problem? – Madiha Khalid Apr 24 '17 at 09:29
  • I don't know much about Flink but it looks like there's a `Task` (?) that is being serialized and it contains a `Lagom_cassServiceImpl` which seems to be a Lagom Service Implementation (which is not meant to be distributed/serialized). I think there's a conceptual error in your approach, or a pojo you are trying to send over Flink holds a reference to the Service that created it, or... :-( – ignasi35 Apr 25 '17 at 09:36
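Following the diagnosis in the comments above, here is a minimal, untested sketch of one possible restructuring: keep the Flink pipeline and its select function on a top-level serializable object instead of inside the service implementation, so the closures Flink serializes into its tasks do not capture the Lagom service. The object name CepJob and its run method are hypothetical, the imports assume Flink 1.2's Scala CEP API, and XYZ / KafkaDeserializeSchema are the classes from the snippet above; this is a sketch of the idea, not a confirmed fix.

import java.util.Properties

import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010

// Hypothetical top-level object: nothing here holds a reference to the
// Lagom service implementation, so Flink only needs to serialize this
// object's own members when it builds its tasks.
object CepJob extends Serializable {

  // Same select function as in the question, but defined outside the service.
  def selectFn(pattern: collection.mutable.Map[String, XYZ]): String = {
    val startEvent = pattern.get("begin").get
    val nextEvent = pattern.get("next").get
    "Alert Detected"
  }

  def run(props: Properties, topic: String): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    env.getConfig.disableSysoutLogging()

    val stream = env.addSource(new FlinkKafkaConsumer010(topic, new KafkaDeserializeSchema, props))

    val deliveryPattern = Pattern.begin[XYZ]("begin").where(_.ABC == 5)
      .next("next").where(_.ABC == 10)
      .next("end").where(_.ABC == 5)

    CEP.pattern(stream, deliveryPattern).select(selectFn _).print()

    env.execute("CEP")
  }
}

The service call would then only pass the already-serializable configuration to CepJob.run(props, topic_name) rather than building the pipeline inside the ServiceCall block.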

0 Answers