
I created a "Trigger Azure Functions on blob containers using an event subscription" function with Visual Studio Code and I am running it locally. For now I want to read netCDF files. Normally I do this with:

from netCDF4 import Dataset
nc_file = 'path of .nc file'
nc = Dataset(nc_file, mode='r')

but now I don't know how to find the path of my file in the container. The __init__.py of my Azure Function looks like this:

import logging 
import azure.functions as func
def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

Thank you in advance for your time and concern.

Elmira

2 Answers


but now I dont know how to find the path of my file in the container

Since you are using a blob trigger, you just need to upload a file to the container referenced by the "path" property in your Azure Function's function.json. Below is my function.json, with which the function reads the file when it is uploaded to the container named "container1" (the "path" line is what defines this):

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "container1/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
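If the container will also hold other file types, the binding path can include a blob name pattern so that only netCDF files fire the trigger. A hedged sketch of just the binding (fixed-extension patterns are supported in the "path" property; "container1" is carried over from the example above, and {name} then binds to the blob name without the .nc extension):

```json
{
  "name": "myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "container1/{name}.nc",
  "connection": "AzureWebJobsStorage"
}
```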

RESULTS:

(screenshot of the function's log output)

SwethaKandikonda
  • First of all, thanks a lot for your reply. I have a JSON file and I can see the name when I upload the file, but I cannot open it with the netCDF4 library, and I don't know any other way to read .nc files. When I give "container1/test.nc" it says: Exception: OSError: [Errno -90] NetCDF: file not found – Elmira Oct 19 '22 at 10:59

I figured it out thanks to DopplerShift's answer, and I am writing it here for anyone who may need it:

from netCDF4 import Dataset

# Read the raw bytes, then hand them to netCDF4 in memory
with open('path/to/netcdf.nc', 'rb') as fobj:
    data = fobj.read()
nc = Dataset('memory', memory=data)
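Inside the blob-triggered function itself the same trick works without any local file at all: func.InputStream already hands you the blob as a file-like stream, so its bytes can go straight into netCDF4. A minimal sketch (the Dataset('memory', memory=...) call assumes the netCDF4 package is listed in requirements.txt; the stream-draining helper below is plain Python, shown here with an in-memory stand-in for the blob):

```python
from io import BytesIO

def read_all(stream) -> bytes:
    """Drain a file-like stream (e.g. func.InputStream) into one bytes buffer."""
    return stream.read()

# Inside main(myblob: func.InputStream) this becomes:
#     data = read_all(myblob)
#     nc = Dataset('memory', memory=data)  # netCDF4 opens the buffer in memory

# Quick check with an in-memory stand-in for the blob stream:
print(read_all(BytesIO(b"netcdf bytes")))  # -> b'netcdf bytes'
```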
Elmira