
Currently I'm working on a semester project where I have to recognize a series of three events, like P -> R -> P.

We have two different event types which are consumed via a Kafka connector from the same topic.

I created a parent class called Event from which the other two types derive.

The Kafka connector deserializes the JSON to the parent class Event using the EventSchema:

val consumer = new FlinkKafkaConsumer("events", new EventSchema, properties)
val stream = env.addSource(consumer)

The pattern looks like this:

val pattern = Pattern
  .begin[Event]("before")
  .subtype(classOf[Position])
  .next("recognized")
  .subtype(classOf[Recognized])
  .next("after")
  .subtype(classOf[Position])

The current problem is that if I send three messages in the appropriate format, the pattern is not recognized.

What else I tried: I changed the pattern like this:

val pattern = Pattern
  .begin[Event]("before")
  .where(e => e.getType == "position")
  .next("recognized")
  .where(e => e.getType == "recognition")
  .next("after")
  .where(e => e.getType == "position")

This pattern works, but later I can't cast the Event to Position or Recognized..
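The cast fails because the deserializer always constructs the base class, and a runtime downcast cannot change an object's actual class. A minimal sketch of this (class names from the question, all fields omitted):

```scala
// The schema builds plain Event objects; asInstanceOf cannot turn
// such an object into a Position afterwards.
class Event
class Position extends Event

val fromSchema: Event = new Event      // what EventSchema returns today
val realPosition: Event = new Position // only this one can be cast to Position
```

Here `fromSchema.asInstanceOf[Position]` would throw a ClassCastException, while `realPosition.asInstanceOf[Position]` succeeds.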

What do I miss here?

Daniel Eisenreich
  • Maybe the elements that you pass to the pattern are Events ? – Jiayi Liao Jan 17 '19 at 02:47
  • That’s right, but is it not possible to have different types of events, order them by event time ascending, and find a pattern among them? Whether all events come from one topic or each event gets its own topic shouldn't matter.. – Daniel Eisenreich Jan 17 '19 at 09:24
  • Did you initialize the objects with the subtype when deserializing from kafka? – Jiayi Liao Jan 17 '19 at 09:30
  • I just deserialize it as Event with `val kafkaSource = new FlinkKafkaConsumer("sp", new EventSchema, properties)` because at runtime multiple types are in one topic.. But can I combine multiple kafkaSources with different types into one? – Daniel Eisenreich Jan 17 '19 at 11:42
  • Can you put the code of EventSchema here? I tried it according to your description, and it works. – Jiayi Liao Jan 17 '19 at 13:38
  • `class EventSchema extends AbstractDeserializationSchema[Event] { val mapper = new ObjectMapper() override def deserialize(bytes: Array[Byte]): Event = mapper.readValue(bytes, classOf[Event]) }` – Daniel Eisenreich Jan 17 '19 at 13:40
  • @bupt_ljy Can you post your code elsewhere so I can verify which part is different? – Daniel Eisenreich Jan 17 '19 at 13:54

1 Answer


According to the comments, I think you should return the subtype instances instead of Event. Here is some example code:

val event = mapper.readValue(bytes, classOf[Event])
event.getType match {
  case "position" => mapper.readValue(bytes, classOf[Position])
  case "recognition" => mapper.readValue(bytes, classOf[Recognized])
  case _ => event // fall back to the base class so the match returns an Event
}
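Since the real schema pulls in Flink and Jackson, here is a self-contained sketch of the same dispatch with the JSON step replaced by a plain type tag (class names from the question; the constructor parameter is hypothetical):

```scala
class Event(val getType: String)
class Position(t: String) extends Event(t)
class Recognized(t: String) extends Event(t)

// Stand-in for mapper.readValue(bytes, ...): pick the concrete class from
// the type tag so that downstream .subtype(...) patterns can match.
def deserialize(tag: String): Event = tag match {
  case "position"    => new Position(tag)
  case "recognition" => new Recognized(tag)
  case other         => new Event(other)  // unknown types stay plain Events
}
```

Because the objects now really are Position or Recognized instances, `.subtype(classOf[Position])` in the CEP pattern can match them.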

I successfully verified this with an example adapted from a test case in CEPITCase.java:

DataStream<Event> input = env.fromElements(
  new Event(1, "foo", 4.0),
  new SubEvent(2, "foo", 4.0, 1.0),
  new SubEvent(3, "foo", 4.0, 1.0),
  new SubEvent(4, "foo", 4.0, 1.0),
  new Event(5, "middle", 5.0)
);

Pattern<Event, ?> pattern = Pattern.<Event>begin("start").subtype(SubEvent.class)
    .followedByAny("middle").subtype(SubEvent.class)
    .followedByAny("end").subtype(SubEvent.class);
Jiayi Liao
  • You're a genius! Just one thing in your example didn't work: when I use `asInstanceOf`, an exception occurs saying I can't cast Event to Position. Because of this I exchanged it with `mapper.readValue(bytes, classOf[Position])` and just re-deserialized it. If you could edit this into your post, I will accept it as the answer! Again, thank you! – Daniel Eisenreich Jan 18 '19 at 08:35
  • @DanielEisenreich Is this the change you need? – Jiayi Liao Jan 18 '19 at 08:46