I am trying to call a Java function from PySpark using py4j. Py4j lets a Python program access Java objects living in a JVM. As a first step, I launched a separate JVM of my own and was able to call the Java function successfully.
py4j mediates this communication through a GatewayServer instance running on the Java side.
What I would like instead is to somehow reach Spark's own internal JVM to run my Java function. What is the entry point for the py4j GatewayServer in Spark, and how can I add my function to that entry point?