I am using a custom Dropwizard metrics source and reporter. Everything works great when I invoke the metrics directly, but not when I go through metrics annotations.
In the code below, the call to testTimer works without my having to register the source with the Spark metrics system (my custom source is an object whose body ends with a call to its register method, which takes care of initialization as needed).
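A minimal sketch of the shape of that object (the names and details here are placeholders for my real source/reporter code, which I haven't shown):

object LashmMetrics {
  // the Dropwizard registry backing my custom Spark metrics source
  private val registry = new com.codahale.metrics.MetricRegistry()

  // registers `registry` as a Spark metrics source and starts my custom reporter;
  // written so that calling it more than once is harmless
  def register(): Unit = {
    // ... registration/reporter wiring omitted ...
  }

  // the object body ends with register(), so the first reference to LashmMetrics
  // in a JVM initialises everything in that JVM
  register()
}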
But unfortunately this does not work (as expected) for metrics annotations without manually registering the source on each executor (the metrics system doesn't know about my custom source/reporter, so it has to be told about them).
In the example below (which does work, by the way) I am calling register() from within an sc.parallelize job, but in real-world applications we can't use this approach (especially when spark.dynamicAllocation.enabled is used); instead we need to register a Spark listener (addSparkListener, then override onExecutorAdded).
Given this context, how can I tell Spark to invoke my register() whenever an executor is added, and to unregister it in onExecutorRemoved?
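Something like this is what I have in mind as a starting point (just a sketch; as far as I understand, these listener callbacks fire on the driver, so by itself this does not run anything inside the new executor's JVM, which is exactly my problem):

import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded, SparkListenerExecutorRemoved}

sparkContext.addSparkListener(new SparkListener {
  override def onExecutorAdded(added: SparkListenerExecutorAdded): Unit = {
    // want LashmMetrics.register() to end up running on executor added.executorId,
    // but this callback runs on the driver, not inside the new executor
  }

  override def onExecutorRemoved(removed: SparkListenerExecutorRemoved): Unit = {
    // and the matching unregister/cleanup when the executor goes away
  }
})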
import org.apache.spark.{SparkConf, SparkContext}
// @Timed, timer(...) and LashmMetrics come from my custom metrics setup (imports omitted here)

object MetricsTestApp extends App {
  System.setProperty("configName", "testEpa")

  val conf = new SparkConf().setAppName("Test") //.setMaster("local[*]")
  val sparkContext = new SparkContext(conf) // .getOrCreate(conf)
  sparkContext.setLogLevel("WARN")

  val total = sparkContext.parallelize(1 to 3)
    .map { i =>
      // testTimer()  // <-- invoking the timer directly like this works fine
      LashmMetrics.register() // manual per-executor registration; this is what I want to avoid
      Thread.sleep(5000)
      assert(Hello.SayHello == 2)
      Thread.sleep(4000)
      assert(Hello.SayHello == 2)
    }
    .count()

  Thread.sleep(20000)
  println(s"Total $total")

  object Hello extends Serializable {
    @Timed(name = "TestAnnotationTimed")
    def SayHello: Int = {
      println("testing....")
      Thread.sleep(1000)
      2
    }
  }

  def testTimer(): Unit = {
    Thread.sleep(500)
    timer("TimerFromTestApp").time {
      println("sleepy..")
      Thread.sleep(1000)
      println("done")
    }
    Thread.sleep(500)
  }
}