
I am not able to view the Spark UI for Databricks jobs executed through a notebook activity in Azure Data Factory.

Does anyone know which permissions need to be added to enable this?


mehere

1 Answer


Update:

Ensure you have cluster-level permissions in the PROD environment.
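As a sketch (not part of the original answer, and assuming a Premium-tier workspace where access control is available), an admin could grant a user "Can Attach To" on the PROD cluster through the Databricks Permissions API, which is typically enough to open that cluster's Spark UI. The workspace URL, token, cluster ID and user name below are placeholders:

```python
# Hedged sketch: grant "Can Attach To" on a cluster via the Databricks Permissions API.
# All values below are placeholders, not values from the question.
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                                  # placeholder

cluster_id = "1234-567890-abcde123"   # hypothetical cluster ID
user_name = "someone@example.com"     # hypothetical user

resp = requests.patch(
    f"{DATABRICKS_HOST}/api/2.0/permissions/clusters/{cluster_id}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            {"user_name": user_name, "permission_level": "CAN_ATTACH_TO"}
        ]
    },
)
resp.raise_for_status()
print(resp.json())  # returns the updated access control list
```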


Job permissions

There are five permission levels for jobs: No Permissions, Can View, Can Manage Run, Is Owner, and Can Manage. Admins are granted the Can Manage permission by default, and they can assign that permission to non-admin users.
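For illustration only, here is a hedged sketch of how an admin might grant Can View on a specific job through the same Databricks Permissions API; the host, token, job ID and user name are assumptions, not values from the question:

```python
# Hedged sketch: grant "Can View" on a job via the Databricks Permissions API.
# All values below are placeholders.
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                                  # placeholder

job_id = "83"                        # hypothetical job ID
user_name = "someone@example.com"    # hypothetical user

resp = requests.patch(
    f"{DATABRICKS_HOST}/api/2.0/permissions/jobs/{job_id}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            {"user_name": user_name, "permission_level": "CAN_VIEW"}
        ]
    },
)
resp.raise_for_status()
print(resp.json())  # returns the job's updated access control list
```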


Next...

I am able to view the completed jobs in the Spark UI without any additional setup.


It may be that what you are doing in the notebook does not constitute a Spark job.

Refer to Web UI and Monitoring and Instrumentation for more details.

Check out the Spark Glossary.

Note: Job A parallel computation consisting of multiple tasks that gets spawned in response to a Spark action (e.g. save, collect); you'll see this term used in the driver's logs.
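As a small illustrative sketch (assuming the usual Databricks notebook globals spark, display and dbutils, and a hypothetical output path), only commands that trigger a Spark action produce entries in the Spark UI's Jobs tab:

```python
# Illustrative only: what does and does not show up as a job in the Spark UI.

# No Spark job: dbutils calls and plain Python run on the driver only.
display(dbutils.fs.mounts())          # nothing appears under the Jobs tab

# Transformations are lazy, so this alone creates no job either.
df = spark.range(1_000_000)

# Actions force execution and each spawns one or more Spark jobs.
row_count = df.count()                                    # job appears in the Spark UI
df.write.mode("overwrite").parquet("/tmp/demo_output")    # hypothetical path; another job
```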

View cluster information in the Apache Spark UI

You can get details about active and terminated clusters. If you restart a terminated cluster, the Spark UI displays information for the restarted cluster, not the historical information for the terminated cluster.

So, if I run a notebook with a task that simply displays my mounts, or with a task that fails with an exception, it is not listed in the Spark UI.


#job/83 is not seen in the Spark UI.


KarthikBhyresh-MT
  • This is disabled for me in prod due to some permissions; we can view it in the dev environment. – mehere Nov 10 '21 at 17:25
  • Can you share snips of both and the access you have on them now? Are you able to view the Spark UI on the cluster, but not the jobs listed? – KarthikBhyresh-MT Nov 11 '21 at 04:39
  • I have added the screenshot to the question. If access to the cluster in Databricks is given, then we are able to view it; we don't need complete access. Is there any specific permission/action that can be assigned to the Azure role to view it? – mehere Nov 11 '21 at 05:19
  • OK, you would have to ask the admin in prod to configure job permissions, since you seem to already have cluster permissions. – KarthikBhyresh-MT Nov 11 '21 at 06:22