
All I want to do is upload files from on-premises to Azure Data Lake Storage Gen2 using the Azure CLI (via the `az` command), but I get a connection error! Can I use the Azure CLI to do that, or do I have to use another tool? PS: I cannot use Azure Data Factory; I want my job running from my on-premises environment, not from the cloud! Thanks.

azure.datalake.store.exceptions.DatalakeRESTException:  HTTP error: 
ConnectionError(MaxRetryError("HTTPSConnectionPool(host='storageAccount.azuredatalakestore.net', port=443): 
Max retries exceeded with url: /webhdfs/v1/my-file-system/data.csv?OP=GETFILESTATUS&api-version=2018-05-01 
(Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fc7ed169c50>: 
Failed to establish a new connection: [Errno -2] Name or service not known')
benabderrahmane

2 Answers


No, the Azure CLI does not support the Gen2 filesystem; see this link.


If you want to upload files, here are two workarounds you can use:

  1. Use Azure Storage Explorer

  2. Use AzCopy v10; note that only v10 supports the Azure Data Lake Storage Gen2 APIs. Use `myaccount.dfs.core.windows.net` as the URI to call the ADLS Gen2 APIs (see the sample command after this list).
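For reference, a rough sketch of an AzCopy v10 upload to the dfs endpoint; the local path, account name, and container name below are placeholders, not values from the question:

```
# Sign in with Azure AD first (the signed-in user needs a data-access role such as
# Storage Blob Data Contributor on the storage account).
azcopy login

# Upload a local folder to an ADLS Gen2 filesystem; myaccount and my-container are placeholders.
azcopy copy "C:\local\folder\*" "https://myaccount.dfs.core.windows.net/my-container/" --recursive --fromTo=LocalBlobFS
```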

Joy Wang
  • Hello, and thank you for your answer. I still get an error while copying my files to my Data Lake Storage Gen2 using AzCopy v10; below is the command I use and the error I get: `azcopy copy "path\to\folder\*" "https://myaccount.dfs.core.windows.net/my-container/" --overwrite=false --follow-symlinks --recursive --fromTo=LocalBlobFS` Error: `403 This request is not authorized to perform this operation using this permission` – benabderrahmane Feb 18 '19 at 09:34
  • In the documentation, I found that I can use a SAS token, but when doing so, I got another error: `400 Authentication information is not given in the correct format. Check the value of Authorization header` – benabderrahmane Feb 18 '19 at 09:40
  • The command I am using: `azcopy copy "path\to\folder\*" "https://myaccount.dfs.core.windows.net/my-container?SAS-TOKEN" --overwrite=false --follow-symlinks --recursive --fromTo=LocalBlobFS` – benabderrahmane Feb 18 '19 at 09:45
  • Once I added my user to Storage Blob Contributor, it worked fine. Wait a few minutes and re-login. – Emeria Aug 26 '19 at 20:00

Got it to work ^^ The problem was with my authentication method; to make it work, you have to add your user as a Data Lake Storage Contributor + Owner. For anyone looking for the role in the UI, it's called "Storage Blob Data Contributor (Preview)". For a resource group, choose Access Control (IAM) | Add, locate the role Storage Blob Data Contributor (Preview) in the blade, and assign access to the users, groups, or roles as meets your needs.
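As an aside, the same role assignment can be scripted from the Azure CLI. This is a minimal sketch; the user, subscription ID, resource group, and storage account names are placeholders:

```
# Assign the Storage Blob Data Contributor role at the storage-account scope.
# All names and IDs below are placeholders; replace them with your own values.
az role assignment create \
  --assignee "user@example.com" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```

As noted in the comments above, the assignment can take a few minutes to propagate, so wait and sign in again before retrying AzCopy.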

benabderrahmane
  • You used AzCopy? If so, I think you could accept my reply as the answer, because it is my solution. – Joy Wang Feb 19 '19 at 00:49