
Hi, I've been trying to execute a custom activity in ADF which receives a CSV file from a container (A); after transformation of the data set, the transformed DataFrame is stored as another CSV file in the same container (A). I've written the transformation logic in Python and have the script stored in the same container (A). The error is raised here: when I execute the pipeline it returns the error *can't find the specified file*. Nothing is wrong with the connections. Is anything wrong in the Batch account or pools? Can anyone tell me where to place the Python script?
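For context, here is a minimal sketch of the kind of transformation script described above, assuming the azure-storage-blob SDK; the connection string, container name, file names, and the pandas logic are placeholders, not the actual code:

```python
# Sketch only: names and transformation logic are assumptions.
import pandas as pd
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # assumption
CONTAINER = "container-a"                                  # assumption
INPUT_BLOB = "input.csv"                                   # assumption
OUTPUT_BLOB = "output.csv"                                 # assumption

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER)

# Download the source CSV from container (A) into the task's working directory.
with open(INPUT_BLOB, "wb") as f:
    f.write(container.download_blob(INPUT_BLOB).readall())

# Placeholder transformation: drop empty rows.
df = pd.read_csv(INPUT_BLOB)
df = df.dropna()
df.to_csv(OUTPUT_BLOB, index=False)

# Upload the transformed CSV back to the same container (A).
with open(OUTPUT_BLOB, "rb") as f:
    container.upload_blob(name=OUTPUT_BLOB, data=f, overwrite=True)
```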

  • Could you please share the exact error screenshot? You are reading the file, doing some transformation, and saving the file again? It might be possible that the filename you mentioned in the script has been changed after the transformation. If possible, also share the code. – Utkarsh Pal Dec 30 '21 at 08:54
  • I recommend that you use Databricks for Python code. You can easily call a Databricks Python script from Data Factory to do your mutations. In Databricks you can mount a data lake/storage account, so you can easily access your CSV file. – 54m Dec 30 '21 at 09:47
  • Error code: 2500 {"errorCategory":0,"code":"CommandProgramNotFound","message":"The specified command program is not found","details":[{"Name":"Message","Value":"The system cannot find the file specified."}]} The code is attached in the link below; I couldn't type it here as it was over the character limit: https://learn.microsoft.com/en-us/azure/batch/tutorial-run-python-batch-azure-data-factory – Saravana Kumar Dec 30 '21 at 13:55
  • Please provide enough code so others can better understand or reproduce the problem. – Community Jan 09 '22 at 09:39

1 Answer


Install Azure Batch Explorer and make sure to choose the proper virtual machine configuration (dsvm-windows), which ensures Python is already in place on the virtual machine where your code is run.

This video explains the steps

https://youtu.be/_3_eiHX3RKE
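
Besides the pool's VM image, the CommandProgramNotFound / "cannot find the file specified" error also appears when the Custom activity cannot locate the script itself. Per the Microsoft tutorial linked in the comments, the script has to sit in the blob folder referenced by the Custom activity's resourceLinkedService and folderPath settings; Batch copies that folder into the task's working directory before running the command. A minimal sketch of placing the script there, assuming azure-storage-blob and placeholder names:

```python
# Sketch only: connection string, container, and folder names are assumptions.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # assumption
CONTAINER = "container-a"                                  # assumption
FOLDER = "scripts"  # must match the Custom activity's folderPath (assumption)

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER)

# Upload main.py so Batch downloads it into the task's working directory.
with open("main.py", "rb") as f:
    container.upload_blob(name=f"{FOLDER}/main.py", data=f, overwrite=True)

# The Custom activity's Command should then match the file name, e.g.:
#   python main.py
```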

All About BI