We have a large existing PHP application that:
- Accepts a log file
- Initialises all database and in-memory store resources
- Processes every line
- Creates a set of output files
The above process happens once per input file, and the input files are written by a Kafka consumer. Is it possible to fit this application into Spark Streaming without porting all the code to Java? For example, in the following manner:
- Get a message from a Kafka topic
- Pass the message to Spark Streaming
- Spark Streaming somehow invokes the legacy app and generates output
- Spark then writes the output back to Kafka
I know what I have described is very high level. I just want to know whether this is possible without recoding the existing app in Java, and can anyone tell me roughly how it could be done?
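To make the question concrete, here is a rough sketch of the kind of glue I am imagining. Spark's `RDD.pipe()` can hand each partition's records to an external process over stdin/stdout, which (if I understand it correctly) would let the legacy PHP CLI stay as-is. The sketch below uses plain `subprocess` to stand in for what `pipe()` would do per partition; the command `["php", "process_line.php"]` is hypothetical, and the runnable demo swaps in `cat` so it works without PHP installed:

```python
import subprocess

# Hypothetical invocation of the legacy app as a line-oriented CLI.
# In a real Spark job this would be the argument to rdd.pipe().
LEGACY_CMD = ["php", "process_line.php"]  # assumption, not real code from our app

def process_partition(lines, cmd):
    """Feed a micro-batch of Kafka messages to an external command via
    stdin and collect its stdout lines -- roughly what Spark's
    rdd.pipe(cmd) does for each partition."""
    proc = subprocess.run(
        cmd,
        input="\n".join(lines) + "\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout.splitlines()

if __name__ == "__main__":
    # Simulated micro-batch; 'cat' stands in for the PHP script here.
    batch = ["log line 1", "log line 2"]
    print(process_partition(batch, ["cat"]))
```

If something like this is viable, I assume the Spark side would read from Kafka, call `rdd.pipe("php process_line.php")` (or the equivalent in a `foreachRDD`/`foreachPartition`), and publish the resulting lines back to Kafka, but I would appreciate confirmation that this is the right direction.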