
I'm new to Azure Function Apps.

I have Python code that I want to run when the HTTP trigger is called.

I have a new project and I'm calling my code in "__init__.py". What is the correct way to call it?

Here is "__init__.py":

import logging
import azure.functions as func
import UploadToGCS


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    if name:
        UploadToGCS(UploadToGCS.upload_files)     # <--- I called it here
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
    else:
        return func.HttpResponse(
             "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
             status_code=200
        )

Currently I receive a "401 error" page.

Can you please suggest how it should be done?

Here is my Python code (I'm uploading files to a Google Cloud Storage bucket using the details in config_file = find("gcs_config.json", "C:/")):

from google.cloud import storage
import os
import glob
import json

# Finding the path to the config file called "gcs_config.json" in directory C:/
# (returns None if the file is not found)
def find(name, path):
    for root, dirs, files in os.walk(path):
        if name in files:
            return os.path.join(root, name)

def upload_files(config_file):
    # Reading the three parameters for the upload from the JSON file
    with open(config_file, "r") as file:
        contents = json.load(file)
        print(contents)

    # Setting up login credentials
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = contents['login_credentials']
    # The ID of GCS bucket
    bucket_name = contents['bucket_name']
    # Setting path to files
    LOCAL_PATH = contents['folder_from']

    for source_file_name in glob.glob(LOCAL_PATH + '/**'):
        # For multiple file upload: setting the destination
        # folder according to the file name
        if os.path.isfile(source_file_name):
            partitioned_file_name = os.path.split(source_file_name)[-1].partition("-")
            file_type_name = partitioned_file_name[0]

            # Setting folder where files will be uploaded
            destination_blob_name = file_type_name + "/" + os.path.split(source_file_name)[-1]

            # Setting up required variables for GCS 
            storage_client = storage.Client()
            bucket = storage_client.bucket(bucket_name)
            blob = bucket.blob(destination_blob_name)

            # Running upload and printing confirmation message
            blob.upload_from_filename(source_file_name)
            print("File from {} uploaded to {} in bucket {}.".format(
                source_file_name, destination_blob_name, bucket_name
            ))

config_file = find("gcs_config.json", "C:/")

upload_files(config_file)
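
For reference, gcs_config.json holds the three parameters the code above reads; a minimal example, with placeholder values, would look like this:

{
    "login_credentials": "C:/path/to/service-account-key.json",
    "bucket_name": "my-example-bucket",
    "folder_from": "C:/folder/with/files/to/upload"
}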

Kind regards, Anna

1 Answer

I'm replying to this as no one else did, and someone might stumble upon this thread looking for an answer.

To run your function locally from VS Code:

  1. Initialize the function in your local environment by running this command in the terminal inside VS Code:

    func init

This will create all the necessary files in your folder and a virtual environment (if you're using Anaconda, you need to configure settings.json for VS Code so that it points to the Conda environment).
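
For a Python project specifically, you can set the worker runtime up front and scaffold the HTTP trigger from a template (the project and function names here are just examples):

    func init MyFunctionApp --python
    cd MyFunctionApp
    func new --name HttpTrigger --template "HTTP trigger" --authlevel anonymous

Note the --authlevel flag: an HTTP trigger scaffolded with the default authLevel of "function" requires a key once deployed to Azure, and calling it without that key returns 401, which may be what's behind the "401 error" page you mentioned.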

  2. Finish your __init__.py file (see the sketch after this step). Then start the function with:

    func start

The function will run on localhost and the terminal will print a link you can use to call it.
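
Regarding the original call: UploadToGCS is a module, so it can't be called like a function; you call the function inside it instead. A minimal sketch of __init__.py, using the names from your own code:

import logging
import azure.functions as func
import UploadToGCS


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Call the function defined in the module, not the module object itself
    config_file = UploadToGCS.find("gcs_config.json", "C:/")
    UploadToGCS.upload_files(config_file)

    return func.HttpResponse("Upload finished.")

Also remove (or guard with if __name__ == "__main__":) the two module-level lines at the bottom of UploadToGCS.py, otherwise the upload runs as soon as the module is imported.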

  3. If you want to deploy it to the cloud, install the Azure Functions extension; you'll have the option to sign in and select your subscription. After this is done, you can deploy the function under any Function App that was created in Azure.
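
Alternatively, you can publish from the same terminal with Core Tools once you're signed in; replace the placeholder with the name of your Function App:

    func azure functionapp publish <APP_NAME>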