
I'm trying to rework a Hazelcast Jet 0.3 DAG system I wrote a few weeks ago into v0.4, as a first step of changing it from batch to stream. All of a sudden I'm experiencing some weird behavior, where I can't be sure that vertices work as expected. Trying to pin down what's happening, I can't find a way to peek into the inner workings of each vertex. Is there a way to get at least some error messages out of them?

In an attempt to isolate the problem I tried to dumb it down to a very simplistic "read from a list, map each item to an entry, write to a map" DAG. But still no success in getting anything out.

Below is my dumbed-down example; maybe I'm making a very simple mistake that someone more knowledgeable will see right away?

Publisher:

// every second via executorservice:
final IStreamMap<Long, List<byte[]>> data = jet.getMap("data");
data.set(jet.getHazelcastInstance().getAtomicLong("key").getAndIncrement(), myByteArray);

Analyzer:

jet.getList(key.toString()).addAll((List<byte[]>) jet.getMap("data").get(key));
jet.getMap("data").remove(key);
logger.debug("List {} has size: {}", key, jet.getList(key.toString()).size());

final Vertex sourceDataMap = this.newVertex("sourceDataMap", readList(key.toString())).localParallelism(1);
final Vertex parseByteArrayToMap = this.newVertex("parseByteArrayToMap", map(
    (byte[] e) ->  new AbstractMap.SimpleEntry<>(jet.getHazelcastInstance().getAtomicLong("counter").getAndIncrement(), e)));
final Vertex sinkIntoResultMap = this.newVertex("sinkIntoResultMap", writeMap("result"));

this.edge(between(sourceDataMap, parseByteArrayToMap))
    .edge(between(parseByteArrayToMap, sinkIntoResultMap));

Listener:

jet.getMap("result").addEntryListener((EntryAddedListener<Long, byte[]>)
       (EntryEvent<Long, byte[]> entryEvent)
           -> logger.debug("Got result: {} at {}", entryEvent.getValue().length, System.currentTimeMillis())
       , true);

The data generation works just fine; everything is OK until the DAG should take over... but no error messages or anything else come from the DAG. Any suggestions?

Anders Bernard
  • Can you show us an [MCVE](https://stackoverflow.com/help/mcve) that can demonstrate your problem? And you can also check the `DiagnosticProcessors` class that has the tools you describe. – Marko Topolnik Jun 30 '17 at 10:50
  • You have a lambda which refers to `jet`. The JetInstance isn't serializable, so it can't be used as part of the lambda. If you want to generate unique IDs, you can use something like `UUID.randomUUID()`. – Can Gencer Jun 30 '17 at 11:13
  • I tried to execute a job from the DAG you posted and it throws an `IllegalArgumentException` at the line that creates the vertex Can mentions. So it fails fast (and loud), before even getting the opportunity to start a job. – Marko Topolnik Jun 30 '17 at 11:18
  • ...and after changing to `randomUUID()`, i get the entry listener's log output. – Marko Topolnik Jun 30 '17 at 11:33
  • OK that's weird, I didn't get that IllegalArgumentException. That's exactly what I'm looking for. Thank you. – Anders Bernard Jul 01 '17 at 05:18
  • And found the reason why I didn't get the error: double executor wrapping. A runnable submitted inside another runnable, so the outer runnable eats the inner one's errors (see the sketch below). – Anders Bernard Jul 01 '17 at 05:27
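
For reference, that failure mode is plain java.util.concurrent behavior, independent of Jet: when a task is submitted from inside another task and nobody calls `get()` on the returned future, the inner exception is stored in the future and never reaches a log. A minimal sketch (class name and messages are made up for illustration):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SwallowedErrorDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();

        // Outer task submits an inner task but never looks at its Future:
        // the inner exception is stored in that Future and silently dropped.
        executor.submit(() -> {
            executor.submit(() -> {
                throw new IllegalStateException("this never shows up anywhere");
            });
        });
        Thread.sleep(200); // give both tasks time to run

        // To surface the failure, keep the Future and call get(), which
        // rethrows it wrapped in an ExecutionException.
        Future<?> inner = executor.submit(() -> {
            throw new IllegalStateException("this one is rethrown by get()");
        });
        try {
            inner.get();
        } catch (Exception e) {
            e.printStackTrace(); // or log it
        }

        executor.shutdown();
    }
}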

1 Answer


Here's your code, slightly sanitized, which works on my side:

public class Main {
    public static void main(String[] args) throws Exception {
        JetInstance jet = Jet.newJetInstance();
        try {
            HazelcastInstance hz = jet.getHazelcastInstance();
            ILogger logger = hz.getLoggingService().getLogger("a");

            // every second via executorservice:
            final IStreamMap<Long, List<byte[]>> data = jet.getMap("data");
            List<byte[]> myByteArray = asList(new byte[1], new byte[2]);
            IAtomicLong keyGen = hz.getAtomicLong("key");
            Long key = keyGen.getAndIncrement();
            data.set(key, myByteArray);

            String stringKey = key.toString();
            hz.getList(stringKey).addAll((List<byte[]>) jet.getMap("data").get(key));
            jet.getMap("data").remove(key);
            logger.severe(String.format("List %s has size: %d", key, jet.getList(stringKey).size()));

            hz.getMap("result").addEntryListener((EntryAddedListener<Long, byte[]>)
                    (EntryEvent<Long, byte[]> entryEvent) -> logger.severe(String.format(
                            "Got result: %d at %d", entryEvent.getValue().length, System.currentTimeMillis())),
                    true);

            DAG dag = new DAG();
            Vertex sourceDataMap = dag.newVertex("sourceDataMap", readList(stringKey)).localParallelism(1);
            // randomUUID() instead of the IAtomicLong: the lambda must not
            // capture the non-serializable JetInstance
            Vertex parseByteArrayToMap = dag.newVertex("parseByteArrayToMap", map(
                    (byte[] e) -> entry(randomUUID(), e)));
            Vertex sinkIntoResultMap = dag.newVertex("sinkIntoResultMap", writeMap("result"));

            dag.edge(between(sourceDataMap, parseByteArrayToMap))
               .edge(between(parseByteArrayToMap, sinkIntoResultMap));

            // get() on the job's future rethrows any failure from the DAG
            jet.newJob(dag).execute().get();
            Thread.sleep(1000);
        } finally {
            Jet.shutdownAll();
        }
    }
}

In the console I see:

SEVERE: [192.168.5.12]:5701 [jet] [0.4-SNAPSHOT] [3.8.2] List 0 has size: 2
SEVERE: [192.168.5.12]:5701 [jet] [0.4-SNAPSHOT] [3.8.2] Got result: 2 at 1498822322228
SEVERE: [192.168.5.12]:5701 [jet] [0.4-SNAPSHOT] [3.8.2] Got result: 1 at 1498822322228
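
On the original question of peeking into each vertex: the `DiagnosticProcessors` class mentioned in the comments is meant for that. A rough sketch of how it could be applied to the mapping vertex above, assuming a `peekOutput(...)` wrapper is available in 0.4 (check the Javadoc for the exact name and signature):

// Assumed API: DiagnosticProcessors.peekOutput(...) wraps a processor
// supplier so that every item the vertex emits is logged, without
// changing the DAG shape. Verify against the Jet 0.4 Javadoc.
Vertex parseByteArrayToMap = dag.newVertex("parseByteArrayToMap",
        peekOutput(map((byte[] e) -> entry(randomUUID(), e))));

Independently of that, note that `jet.newJob(dag).execute().get()` in the code above is what surfaces DAG failures: the returned future rethrows whatever the job failed with, which is exactly what the double executor wrapping in your original setup was swallowing.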
Marko Topolnik