
I am creating a streaming analytics application using Spark, Flink, and Kafka. Each piece of analytics functionality will be implemented as a microservice so that it can be reused in different projects later.

I can run my Spark/Flink jobs perfectly from a simple Scala application and submit them to the Spark and Flink clusters respectively. But I need to start/run a job when a REST POST startJob() request is made to my web service.

How can I integrate my Spark & Flink data processing functionality in a web service oriented application?
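To make the question concrete, here is a minimal sketch of the pattern I mean: a plain-JDK web service exposing POST /startJob. The submitJob() method and the returned job id are placeholders; in a real service it would hand the job off to the Spark/Flink cluster (for example via their submission APIs).

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Sketch: a web service that starts a streaming job on POST /startJob.
public class JobService {

    // Hypothetical hook: replace with a real call to the cluster's
    // job-submission API (Spark or Flink).
    static String submitJob() {
        return "job-1";  // pretend the cluster returned a job id
    }

    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/startJob", exchange -> {
            byte[] body;
            int status;
            if ("POST".equals(exchange.getRequestMethod())) {
                body = ("started " + submitJob()).getBytes(StandardCharsets.UTF_8);
                status = 200;
            } else {
                body = "POST only".getBytes(StandardCharsets.UTF_8);
                status = 405;  // reject anything that is not a POST
            }
            exchange.sendResponseHeaders(status, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
            exchange.close();
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        start(8080);
        System.out.println("listening on :8080");
    }
}
```

The open question is what submitJob() should contain so that the job runs on the cluster rather than inside the web service's JVM.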

So far I have tried Lagom microservices, but I ran into many issues; see:

  1. Best approach to ingest Streaming Data in Lagom Microservice
  2. java.io.NotSerializableException using Apache Flink with Lagom

I think I am not taking the right direction for a stream-processing microservice application. I am looking for the right way to expose this analytics functionality over a REST service.

Madiha Khalid

2 Answers


Flink has a REST API you can use to submit and control jobs -- it's used by the Flink Web UI. See the docs here. See also this previous question.
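As a sketch of what that looks like: the endpoints below (POST /jars/:jarid/run to start a job from an uploaded jar, GET /jobs/:jobid to check its status) are part of Flink's documented REST API, while the JobManager address and the jar/job ids are assumptions for illustration.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Building requests against Flink's REST API (the same API the Web UI uses).
public class FlinkRestCalls {

    // Assumed JobManager address; adjust to your cluster.
    static final String BASE = "http://flink-jobmanager:8081";

    // POST /jars/:jarid/run starts a job from a previously uploaded jar.
    public static HttpRequest runJar(String jarId) {
        return HttpRequest.newBuilder(URI.create(BASE + "/jars/" + jarId + "/run"))
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();
    }

    // GET /jobs/:jobid reports the status of a running or finished job.
    public static HttpRequest jobStatus(String jobId) {
        return HttpRequest.newBuilder(URI.create(BASE + "/jobs/" + jobId))
                .GET()
                .build();
    }

    public static void main(String[] args) {
        System.out.println(runJar("example.jar").uri());
        System.out.println(jobStatus("abc123").uri());
    }
}
```

Your web service's startJob() handler can simply send the first request with an HTTP client and return the job id from the response.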

David Anderson

I think the REST API only provides job-running details. Is there any Flink API that, when a Spring Boot REST endpoint is called, connects to the Kafka streaming data and returns the Kafka data?

prostý člověk