I have a Java process that implements Runnable and subscribes/pulls to/from a ZeroMQ publisher/pusher, collects the data for a while, and once a certain number of records is reached inserts them into a database. Everything works fine, but when the publisher/pusher stops, the process handling and collecting the data immediately dies. Once the publisher restarts everything works again, but the records that were collected in memory are never inserted because the process is stopped immediately. This loss of data on publisher restarts is my issue.
I've already tried checking for thread interruption, surrounding the code with various try/catch blocks, etc. I'm really at a loss as to what else to try — any suggestions?
Here is the relevant part of the code:
String address = "tcp://" + meta.getExporterAddress() + ":" + meta.getExporterPort();
ZMQ.Context context = ZMQ.context(10);
ZMQ.Socket socket;
if (exporterMode.equalsIgnoreCase("STREAMER")) {
    socket = context.socket(ZMQ.PULL);
    socket.connect(address);
    this.log.info("connected ZMQJob as PULL: " + exporterMode);
} else {
    // BROKER
    socket = context.socket(ZMQ.SUB);
    socket.connect(address);
    socket.subscribe(ZMQ.SUBSCRIPTION_ALL);
    this.log.info("connected ZMQJob as SUB: " + exporterMode);
}
this.log.info("Started ZMQJob for " + meta.getTargetTableName());
long startTime = System.currentTimeMillis();
while (!Thread.currentThread().isInterrupted()) {
    jsonString = Snappy.uncompress(socket.recv(0));
    BufferedReader bufReader = new BufferedReader(new StringReader(new String(jsonString)));
    while ((line = bufReader.readLine()) != null) {
        jsonLines.add(line);
    }
    if (jsonLines.size() >= threshold || System.currentTimeMillis() > startTime + timeout * 1000) {
        // db-related code used for inserting here
    }
}
I imagine maybe using a different condition for the while loop might be a solution, but I'm not sure what that condition would be.
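For what it's worth, the direction I've been considering is giving the socket a receive timeout (JeroMQ has `socket.setReceiveTimeOut(ms)`, after which `recv` returns `null` instead of blocking forever) and flushing whatever is buffered in a `finally` block, so a publisher restart can't take the in-memory records with it. To convince myself the loop shape works, I mocked it up with a `BlockingQueue` standing in for the socket — here `poll(timeout)` plays the role of a `recv` that can time out, and `flush` stands in for the database insert (both names are mine, not from the real code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class DrainLoop {
    // Stand-in for the database: collects what would have been inserted.
    static final List<String> inserted = new ArrayList<>();

    static void flush(List<String> buffer) {
        // Real code would do the DB insert here; the sketch just moves records over.
        inserted.addAll(buffer);
        buffer.clear();
    }

    public static void main(String[] args) throws InterruptedException {
        // The queue stands in for the ZMQ socket; poll(timeout) mimics
        // recv with a receive timeout (returns null when nothing arrives).
        BlockingQueue<String> source = new LinkedBlockingQueue<>();
        source.add("rec1");
        source.add("rec2");

        List<String> buffer = new ArrayList<>();
        int threshold = 10;
        int idleRounds = 0;
        try {
            // The sketch exits after a few idle rounds so it terminates;
            // the real loop would keep spinning and reconnect instead.
            while (!Thread.currentThread().isInterrupted() && idleRounds < 3) {
                String record = source.poll(50, TimeUnit.MILLISECONDS);
                if (record == null) {
                    // Timed out: publisher may be down. Don't die here —
                    // a time-based flush could also be triggered at this point.
                    idleRounds++;
                    continue;
                }
                idleRounds = 0;
                buffer.add(record);
                if (buffer.size() >= threshold) {
                    flush(buffer);
                }
            }
        } finally {
            // Whatever kills the loop, don't lose what's already buffered.
            if (!buffer.isEmpty()) {
                flush(buffer);
            }
        }
        System.out.println(inserted);
    }
}
```

The key point of the shape is that `recv`/`poll` returning `null` becomes an ordinary loop iteration rather than a death sentence, and the `finally` block guarantees the buffered records are flushed even if the thread is interrupted or an exception escapes. I'm not sure this is the idiomatic ZeroMQ way, though.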