
I keep getting the following error from Data Factory whenever I run a U-SQL job:

Job submission failed, the user 'adla account name' does not have permissions to a subfolder in the /system/ path needed by Data Lake Analytics. Please run “Add User Wizard” from the Data Lake Analytics Azure Portal or use Azure PowerShell to grant access for the user to the /system/ and its children on the Data Lake Store.

And I am not using any firewall, which was the suggested cause in this post:

Run U-SQL Script from C# code with Azure Data Factory

I am using Azure Data Lake Store service principal authentication. When I start the job from Visual Studio, it works fine.
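
For what it's worth, this is roughly how I would expect to check the ACL on /system for the principal (just a sketch; I am not certain Get-AzureRmDataLakeStoreItemAclEntry is the exact cmdlet, and the app/account names below are placeholders, not my real setup):

Login-AzureRmAccount
# Look up the service principal that the linked service authenticates with (placeholder name)
$servicePrincipal = Get-AzureRmADServicePrincipal -SearchString "adla"
# List the ACL entries on /system and check whether the principal's object ID shows up
Get-AzureRmDataLakeStoreItemAclEntry -AccountName "yourdatalakename" -Path /system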

I would be grateful for any ideas.

thanks

Muhammad Shaharyar
candidson
  • Have you explicitly given the account permissions to Azure Data Lake Store? See step 2 of http://spr.com/azure-data-lake-store-add-service-to-service-authentication/ – Peter Bons Jul 05 '17 at 15:42
  • Yes, I did that. The Active Directory application has access to the ADLS root folder and all its children... – candidson Jul 05 '17 at 21:46

2 Answers


If you are authorising the Azure Data Lake Analytics linked service from Azure Data Factory with a service principal, that could be your problem.

I have an outstanding support ticket with Microsoft because the service principal authentication method only works with simple Data Factory activities like 'Copy'. It does not work if you want to authenticate complex activities like 'DotNetActivity'.

My advice would be to change the linked service back to using session and token auth, then deploy your activities and try again.
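
If you deploy from PowerShell rather than Visual Studio, re-deploying the edited linked service definition would look roughly like this (a sketch against the ADF v1 AzureRM.DataFactories cmdlets; the resource group, factory and file names are placeholders):

# Push the updated linked service JSON (switched back to session/token authorization) to the data factory
New-AzureRmDataFactoryLinkedService -ResourceGroupName "myResourceGroup" -DataFactoryName "myDataFactory" -File ".\AzureDataLakeAnalyticsLinkedService.json"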

Hope this helps.

Paul Andrew
  • Thank you @Paul Andrew for your input. I was able to solve the problem. Actually, the Azure Active Directory app account I was using really had no permissions on the "system" and on the "catalog" folders, which is really weird... However, I will keep your hint in mind. Thanks a lot! And btw, your blogs rock! – candidson Jul 07 '17 at 14:02

It does sound like a permissions problem. You can run this PowerShell script to ensure you have applied the proper permissions for the service principal:

Login-AzureRmAccount

# Names of the AAD application used by Data Factory and of the Data Lake Store account
$appname = "adla"
$dataLakeStoreName = "yourdatalakename"

# Look up the AAD application and its service principal
$app = Get-AzureRmADApplication -DisplayName $appname
$servicePrincipal = Get-AzureRmADServicePrincipal -SearchString $appname

# Grant the service principal full permissions on the root folder and on /system
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path / -AceType User -Id $servicePrincipal.Id -Permissions All
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path /system -AceType User -Id $servicePrincipal.Id -Permissions All
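
Per the comments on this question, the Data Lake Analytics "catalog" folder ended up needing an explicit entry as well; if you see the same thing, the same cmdlet can be reused (the /catalog path is inferred from those comments rather than part of the original script):

Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path /catalog -AceType User -Id $servicePrincipal.Id -Permissions All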

If you want to create everything from scratch using a PowerShell script, here is a blog that will help you:

http://dyor.com/setting-up-an-azure-data-lake-and-azure-data-factory-using-powershell/

mattdyor
  • Thanks @mattdyor... The security principal has permission to the root folder of ADLS + all its children. Also, it is part of the default permissions... – candidson Jul 06 '17 at 07:47
  • I double checked this. I did assign permissions to the root folder and all its children. However, the system and the catalog folders did not get any permissions. I had to reassign permissions to these folders manually. The script surely helped. Thanks! – candidson Jul 07 '17 at 14:01
  • Excellent! I find Azure Data Lake Store permissions fail way too silently. I spent a few hours with bad permissions. If the portal said "Security principal X is trying to access folder Y, but does not have permissions", it would have saved me a ton of time. But if you just use PowerShell to create the SP (which is what I do in the referenced blog), that also works. Thank you. – mattdyor Jul 10 '17 at 22:02