
Hi, I am planning to use Flink as the backend for a feature where we will show users a UI to graphically create event patterns, e.g. multiple login failures from the same IP address.

We will then create the Flink pattern programmatically from the criteria the user specifies in the UI.

Is there any documentation on how to dynamically create the JAR file and dynamically submit the job to the Flink cluster?

Are there any best practices for this kind of use case with Apache Flink?

JDForLife

2 Answers


Another way to achieve this is to have a single jar containing something like an "interpreter", to which you pass the definition of your patterns in some format (e.g. JSON). The interpreter then translates that JSON into Flink's operators. This is how the Flink-based execution engine in https://github.com/TouK/nussknacker/ works. If you use this approach, you will need to handle redeployment of new definitions in your own application.
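A minimal sketch of that interpreter idea, assuming a hypothetical flat definition format (a `Map` standing in for parsed JSON) with made-up field names (`eventType`, `keyField`, `times`, `withinSeconds`). The comments show how each field would map onto Flink CEP calls; the actual Flink wiring is omitted so the translation step stays self-contained:

```java
import java.util.Map;

// One generic job jar receives a pattern definition and translates it
// into a typed spec. In a real engine (Nussknacker-style), the spec
// would then drive Flink CEP, roughly:
//   stream.keyBy(e -> e.get(spec.keyField()));
//   Pattern.begin("fail")
//          .where(evt -> evt.type().equals(spec.eventType()))
//          .times(spec.times())
//          .within(Duration.ofSeconds(spec.withinSeconds()));
public class PatternInterpreter {

    public record PatternSpec(String eventType, String keyField,
                              int times, long withinSeconds) {}

    public static PatternSpec interpret(Map<String, Object> def) {
        // Fail fast on missing fields so a bad UI-supplied definition
        // is rejected before any job is (re)deployed.
        return new PatternSpec(
            (String) require(def, "eventType"),
            (String) require(def, "keyField"),
            ((Number) require(def, "times")).intValue(),
            ((Number) require(def, "withinSeconds")).longValue());
    }

    private static Object require(Map<String, Object> def, String key) {
        Object v = def.get(key);
        if (v == null) throw new IllegalArgumentException("missing field: " + key);
        return v;
    }

    public static void main(String[] args) {
        // Example: "multiple login failures from the same IP address".
        PatternSpec spec = interpret(Map.of(
            "eventType", "LOGIN_FAILURE",
            "keyField", "ipAddress",
            "times", 3,
            "withinSeconds", 60));
        System.out.println(spec);
    }
}
```

Validating the definition in one place like this also gives you a natural hook for the redeployment step: when a new JSON arrives, interpret it, cancel the old job, and start a new one from the resulting spec.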


One straightforward way to achieve this would be to generate a SQL script for each pattern (using MATCH_RECOGNIZE) and then use Ververica Platform's REST API to deploy and manage those scripts: https://docs.ververica.com/user_guide/application_operations/deployments/artifacts.html?highlight=sql#sql-script-artifacts
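As a sketch of what generating such a script could look like for the asker's example, here is a small Java helper that renders a `MATCH_RECOGNIZE` query from the UI criteria. The table and column names (`events`, `ip_address`, `event_type`, `event_time`) are hypothetical placeholders:

```java
// Renders a Flink SQL MATCH_RECOGNIZE script for "N login failures
// from the same IP within M minutes". In practice you would template
// the event type and measures from the UI definition as well.
public class MatchRecognizeGenerator {

    public static String loginFailureQuery(int times, int windowMinutes) {
        return String.join("\n",
            "SELECT *",
            "FROM events",
            "MATCH_RECOGNIZE (",
            "  PARTITION BY ip_address",                 // same IP address
            "  ORDER BY event_time",
            "  MEASURES LAST(F.event_time) AS last_failure",
            "  ONE ROW PER MATCH",
            "  AFTER MATCH SKIP PAST LAST ROW",
            "  PATTERN (F{" + times + "})",              // N repetitions
            "  WITHIN INTERVAL '" + windowMinutes + "' MINUTE",
            "  DEFINE F AS F.event_type = 'LOGIN_FAILURE'",
            ")");
    }

    public static void main(String[] args) {
        System.out.println(loginFailureQuery(3, 1));
    }
}
```

Each generated script is then a plain-text artifact that the platform's REST API can deploy and version independently, with no jar build involved.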

Flink doesn't provide tooling for automating the creation of JAR files, or for submitting them. That's the sort of thing you might use a CI/CD pipeline for (e.g., GitHub Actions).
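That said, once a pipeline has produced a jar, the Flink JobManager's REST API (served on port 8081 by default) can be used to submit it: POST the jar to `/jars/upload`, then POST to `/jars/<jar-id>/run`. A sketch of the run-request construction, with the jar id, host, and entry class as hypothetical examples (the multipart upload itself is elided):

```java
import java.net.URI;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Builds the "run jar" request URI for Flink's REST API. A jar is first
// uploaded via POST /jars/upload (multipart form field "jarfile"); the
// response contains the jar id used here.
public class FlinkJobSubmitter {

    public static URI buildRunUri(String baseUrl, String jarId, String entryClass) {
        // entry-class selects the main class inside the uploaded jar.
        String query = "entry-class="
            + URLEncoder.encode(entryClass, StandardCharsets.UTF_8);
        return URI.create(baseUrl + "/jars/" + jarId + "/run?" + query);
    }

    public static void main(String[] args) {
        URI run = buildRunUri("http://localhost:8081",
                "abc123_pattern-job.jar", "com.example.PatternJob");
        System.out.println(run);
        // An HTTP client (e.g. java.net.http.HttpClient) would then issue
        // a POST to this URI; the response JSON carries the new job id.
    }
}
```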

Disclaimer: I work for Ververica.

David Anderson
    Can you edit this answer to acknowledge your conflict of interest as it seems like you work for Ververica? – Ben Longo Mar 16 '22 at 01:59