
I am trying to learn about the capabilities available in Azure for running several command line applications as part of an Azure Web App. The command line applications are basically .exe apps that also need some dependency files (*.dll) to be present by the time the .exe runs.

I want to write an Azure Web App, and part of the logic of the web application would be to run the .exe and store the output somewhere else (I am not really concerned about that part yet). However, after searching the Internet about Azure Storage, mounting/dismounting drives, etc., I got totally confused.

With that being said, I am struggling to figure out what the design of my web application would look like using Azure technologies:

  1. What kind of Azure Storage is recommended in this case? (File Storage? Blob Storage?)
  2. In order to access the different .exe files and all their dependencies, I guess I would need to somehow mount the storage on the web app at run time so I can refer to the path where the .exe is located. Is that even possible with an Azure Web App, since those are Platform as a Service and we don't have control of the machine where the code is executed?
  3. Also, since I have three different .exe command line applications (each with its dependency .dll files), what is the best way to organize these in Azure Storage?

There are some fundamentals that I am missing, and it would be great to hear your opinions on this matter, as I have developed several Azure Web Apps but never integrated Azure Storage into them.

EDIT: Just to clarify, the number of command line tools that I want to run can vary. I will eventually have different versions of the same command line tool and will still need a way to run any version, hence I thought an Azure Storage account where I can upload the different versions is somehow required.

user3587624

1 Answer


Azure App Service has its own storage. That's where you deploy your .exe and .DLLs (preferably with Git, but webdeploy/zipdeploy/FTP are always options).

You invoke it just like you do in any web application (.NET/PHP/whatever stack). Nothing special here. Write to %TEMP% if temporary storage is required.
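
For instance, here's a minimal sketch of shelling out to a deployed tool from .NET and writing scratch output under `%TEMP%`. The tool name `mytool.exe`, its folder and its argument format are placeholders, not part of your actual setup:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading.Tasks;

public static class ToolRunner
{
    // Runs a deployed command line tool and returns the path of its output file.
    // "tools\mytool.exe" and the argument format are placeholders for the real tool.
    public static async Task<string> RunAsync(string inputFile)
    {
        // HOME points at the persistent d:\home area in App Service;
        // deployed content lives under site\wwwroot.
        var exePath = Path.Combine(
            Environment.GetEnvironmentVariable("HOME") ?? ".",
            "site", "wwwroot", "tools", "mytool.exe");

        // %TEMP% (d:\local\Temp on App Service) is fine for scratch output.
        var outputFile = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".out");

        var psi = new ProcessStartInfo
        {
            FileName = exePath,
            Arguments = $"\"{inputFile}\" \"{outputFile}\"",
            UseShellExecute = false,
            RedirectStandardError = true,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            // Read stderr before waiting so a chatty tool can't deadlock on a full pipe.
            var stderr = await process.StandardError.ReadToEndAsync();
            process.WaitForExit();

            if (process.ExitCode != 0)
                throw new InvalidOperationException($"Tool exited with {process.ExitCode}: {stderr}");
        }

        return outputFile;
    }
}
```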

You should still use Blob storage for your output if said output looks like a binary blob; otherwise use a more suitable storage mechanism (a SQL/NoSQL database - Azure SQL and Cosmos DB fit here).
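
Pushing the result file to Blob storage with the Azure.Storage.Blobs SDK could look roughly like this (the container name and the connection string setting are assumptions):

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class OutputUploader
{
    // Uploads a local result file to a Blob container.
    // The connection string setting and container name are placeholders.
    public static async Task UploadAsync(string localPath, string blobName)
    {
        var connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");
        var container = new BlobContainerClient(connectionString, "tool-output");

        await container.CreateIfNotExistsAsync();
        await container.GetBlobClient(blobName).UploadAsync(localPath, overwrite: true);
    }
}
```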

You could probably get by with just an Azure Function if you don't really need a full-blown Web App.
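
If a Function is enough, the same shell-out can sit behind an HTTP trigger. A sketch using the in-process C# model; the tool path and the `args` query parameter are illustrative, not a fixed convention:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class RunToolFunction
{
    // HTTP-triggered function that shells out to a deployed tool
    // and returns its standard output.
    [FunctionName("RunTool")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        var exePath = Path.Combine(
            Environment.GetEnvironmentVariable("HOME") ?? ".",
            "site", "wwwroot", "tools", "mytool.exe");

        var psi = new ProcessStartInfo
        {
            FileName = exePath,
            Arguments = req.Query["args"],
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            var stdout = await process.StandardOutput.ReadToEndAsync();
            process.WaitForExit();
            log.LogInformation("Tool exited with code {ExitCode}", process.ExitCode);
            return new OkObjectResult(stdout);
        }
    }
}
```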

evilSnobu
  • Maybe I need to edit my question and add some details (apologies for that!), but the thing is that the number of applications (.exe) we will have to run won't be a fixed number. They will change over time, and that's why I think it will make sense to have them in Azure Storage. That will give me the flexibility to add new command line applications (or versions of the same application) and, at runtime, simply pick one of them, mount it and run it. Let me know if it is worth editing my initial post, please :) – user3587624 Mar 19 '18 at 22:16
  • 1
    Sure, you could have those binaries in Blob storage and pass them in with a query parameter or POST body. Your code still needs to download them somewhere in `%TEMP%` first before the sandbox can run them. You can't run binaries just by referencing a Blob storage URL. But if you treat your _let's add one more binary_ scenario as a new deployment it makes more sense to deploy alongside your main Web App/Function App. – evilSnobu Mar 19 '18 at 22:19
  • Will Blob storage be the right storage to pick vs. let's say File Storage? In my case the tool contains .exe, dll files, config files. Also, by the nature of an Azure Web App (PaaS) once they get downloaded, it doesn't mean I can reuse them again when a new request hits my service, right? In that case I would need to download the command line app again from the storage. Is that correct? – user3587624 Mar 19 '18 at 22:21
  • 1
    Once downloaded you can reference them locally, many times. In App Service, `d:\home` is persistent storage, while `d:\local` is ephemeral, gets wiped on each app restart. More here - https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system. Use Blob not File, you won't be able to mount a File storage endpoint in App Service due to sandbox limitations - https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox#restricted-outgoing-ports. Also don't hardcode paths, use the corresponding env vars, which you'll find in Kudu, on the Environment page. – evilSnobu Mar 19 '18 at 22:24
  • That sounds pretty cool then! Seems like I would be able to keep that location available as a "static" location and then have as many versions of my different command line applications available to me no matter how many machines are spun up. I am marking your answer as the right one! – user3587624 Mar 19 '18 at 22:26
  • 1
    You can treat `d:\local` as a cache. If it's not there, pull from Blob storage, then invoke from `d:\local`. Keep it simple. No need for special _on start_ logic. – evilSnobu Mar 19 '18 at 22:29
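
A sketch of the cache-and-invoke pattern described in these comments, assuming the tools are uploaded to a Blob container named `tools` under a `<toolName>/<version>/` prefix (the container name, blob layout and connection string setting are assumptions, not the only way to organize this):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class CachedToolRunner
{
    // Downloads a tool from Blob storage into the instance-local (ephemeral)
    // temp area if it is not already cached there, then invokes it.
    public static async Task<int> RunAsync(string toolName, string version, string args)
    {
        // %TEMP% resolves to the instance-local d:\local\Temp in App Service.
        var cacheDir = Path.Combine(Path.GetTempPath(), "tools", toolName, version);
        var exePath = Path.Combine(cacheDir, $"{toolName}.exe");

        if (!File.Exists(exePath))
        {
            // Cache miss: pull the whole version folder (exe + dlls + config) from Blob storage.
            var container = new BlobContainerClient(
                Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"), "tools");

            Directory.CreateDirectory(cacheDir);
            await foreach (var blob in container.GetBlobsAsync(prefix: $"{toolName}/{version}/"))
            {
                var localPath = Path.Combine(cacheDir, Path.GetFileName(blob.Name));
                await container.GetBlobClient(blob.Name).DownloadToAsync(localPath);
            }
        }

        // Invoke from the local cache; the dependent dlls sit next to the exe.
        var psi = new ProcessStartInfo
        {
            FileName = exePath,
            Arguments = args,
            WorkingDirectory = cacheDir,
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
            return process.ExitCode;
        }
    }
}
```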