
I have a Python Azure Function that runs an ETL process. The function downloads files from an API to the temp directory and uploads them to a container. I am getting the following error: [Errno 28] No space left on device. Since it is a space issue, I have checked every place I can think of: I believe I have enough space in my storage account, and I have also restarted my function app to clear out the temp directory.
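
Roughly, the download/upload step looks like this (a minimal sketch, not my exact code; the API URL, container name, and connection setting are placeholders):

```python
import os
import tempfile

import requests
from azure.storage.blob import BlobServiceClient

def transfer_file(file_url: str, blob_name: str) -> None:
    tmp_path = os.path.join(tempfile.gettempdir(), blob_name)

    # Stream the API response to disk so the whole file is never held in memory.
    with requests.get(file_url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(tmp_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=1024 * 1024):
                f.write(chunk)

    # Upload the temp file to the container ("etl-output" is a placeholder),
    # then delete it to free the temp directory again.
    service = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
    blob = service.get_blob_client(container="etl-output", blob=blob_name)
    with open(tmp_path, "rb") as f:
        blob.upload_blob(f, overwrite=True)
    os.remove(tmp_path)
```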

James Lin
  • Please include the code you're using to download files from the API to your temp directory. Are you certain you have enough disk space? – ewokx May 26 '22 at 04:12

2 Answers


Azure Functions has a limit of 5 GB of data per session. So even if your storage account can hold unlimited data, the Azure Function will not be able to handle that much data at one time. Most probably, this error comes from the function itself.
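
For instance, instead of downloading the whole blob into the function's own storage, you can stream it in chunks (a minimal sketch, assuming azure-storage-blob v12; the connection string, names, and `process` function are illustrative):

```python
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<connection-string>",   # placeholder
    container_name="raw-data",        # illustrative container name
    blob_name="big-input.csv",        # illustrative blob name
)

# download_blob() returns a StorageStreamDownloader; iterating chunks()
# keeps only one chunk in memory at a time instead of staging the full blob.
for chunk in blob.download_blob().chunks():
    process(chunk)  # hypothetical per-chunk transform
```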

sputnik
  • How do you calculate this 5 GB of data? Do you mean downloading 5 GB of data? – James Lin May 26 '22 at 04:42
  • When I ran an Azure Function to download data from a blob into the function's own storage for processing, I got this error. – sputnik May 26 '22 at 07:43
  • If you trigger the function many times at high frequency, this error can also occur, since the aggregate size of the data stored in the function's storage keeps growing. – sputnik May 26 '22 at 07:44
  • What is the solution for this? What needs to be done to avoid the space error? – James Lin May 26 '22 at 10:51
  • Ensure that the Azure Function is not handling huge amounts of data at any point in time. If that cannot be avoided, create multiple functions in the function app and run them in parallel (a rough sketch of that fan-out follows below). But I would suggest using Spark in Azure Databricks; it is designed to handle such large data flows and the associated transformations. – sputnik May 27 '22 at 09:39
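
A rough sketch of that fan-out idea, assuming the Python v2 programming model with a queue trigger (the queue name and message format are assumptions):

```python
import azure.functions as func

app = func.FunctionApp()

# Each queue message names one file, so a single invocation never
# stages more than one file's worth of data in temp storage.
@app.queue_trigger(arg_name="msg", queue_name="etl-files",
                   connection="AzureWebJobsStorage")
def process_one_file(msg: func.QueueMessage) -> None:
    file_url = msg.get_body().decode("utf-8")
    # download, transform, and upload this single file here
```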

Microsoft-hosted agents provide 10 GB of storage for your source and build outputs (here).

If your source code and build outputs are larger than 10 GB, I recommend using a self-hosted agent to complete your build. You can also try installing without the pip cache to see if that helps with the Python storage issue: `pip install --no-cache-dir tensorflow-gpu`

The maximum data per session for Azure Functions is 5 GB. Even if your storage account allows unlimited data, the Azure Function will be unable to manage larger amounts of data at once. As a result, this error is most likely raised by the function itself, not the storage account.

As a temporary fix, you can scale your App Service plan up and then scale it back down to the original size; the recycle clears the local temp space and the function should start working again.
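
For example, with the Azure CLI (the plan and resource group names are placeholders; this assumes the app runs on a dedicated App Service plan):

```bash
# Scale the plan up to a bigger SKU, then back down; each change
# recycles the instances, which clears the local temp storage.
az appservice plan update --name my-plan --resource-group my-rg --sku S2
az appservice plan update --name my-plan --resource-group my-rg --sku S1
```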

For more information, refer to ref1 and ref2.

RithwikBojja
  • As you suggested, I wanted to try clearing out the cache data, but I am confused about where to run this command: `pip install --no-cache-dir tensorflow-gpu` – James Lin May 26 '22 at 05:02