
I'm using Apache Spark, and the metrics UI (on port 4040) is very useful.

I wonder whether it's possible to add custom metrics to this UI: custom task metrics, but ideally custom RDD metrics too (for example, the execution time of a single RDD transformation).

It would be nice to have custom metrics grouped by streaming batch, job, and task.

I have seen the TaskMetrics object, but it's marked as a developer API, and it appears useful only for input/output sources; it does not support custom values.

Is there a Spark way to do that, or an alternative?

crak

1 Answer


You could use the shared variables support [1] built into Spark. I have often used them to implement something like that.

[1] http://spark.apache.org/docs/latest/programming-guide.html#shared-variables
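A minimal sketch of that approach using named accumulators, one kind of shared variable. Named accumulators are shown per stage in the 4040 web UI, which approximates custom task metrics. This assumes the Spark 2.x `longAccumulator` API; in older 1.x releases you would use `sc.accumulator(0L, "name")` instead. The metric names and the doubling transformation are just illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CustomMetricExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("custom-metrics").setMaster("local[*]"))

    // Named accumulators appear in the stage detail page of the web UI.
    val recordsProcessed = sc.longAccumulator("recordsProcessed")
    val transformNanos   = sc.longAccumulator("transformTimeNanos")

    sc.parallelize(1 to 1000)
      .map { x =>
        val start  = System.nanoTime()
        val result = x * 2                         // the actual transformation
        recordsProcessed.add(1)                    // count records seen
        transformNanos.add(System.nanoTime() - start) // accumulate elapsed time
        result
      }
      .count()                                     // an action forces evaluation

    // Accumulator values are only reliable after an action completes.
    println(s"records = ${recordsProcessed.value}, " +
            s"time = ${transformNanos.value} ns")
    sc.stop()
  }
}
```

Note that accumulator updates inside transformations may be applied more than once if a task is retried, so they are best treated as approximate metrics rather than exact counts.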

Andrea