
I'm running a single Azure Function on the consumption plan, which I picked for its serverless model and to minimize cost. The function consumes messages from a Service Bus topic and writes some output to blob storage.
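(For context, the shape of such a function looks roughly like the sketch below, in Python purely for illustration; the actual language, bindings, and processing logic are application-specific. The transformation is kept as a plain function so the sketch runs without the Azure SDK; in the real app the Service Bus trigger and blob output binding would be declared in function.json.)

```python
# Illustrative sketch only: a Service-Bus-triggered function that writes its
# output to blob storage. The trigger and the blob output binding would
# normally be declared in function.json; only the pure processing step is
# shown here so the sketch runs without the Azure SDK.

def process_message(body: bytes) -> bytes:
    """Transform one Service Bus message body into the blob payload.
    The real transformation is application-specific; upper-casing stands in."""
    return body.upper()

# In the real function, the runtime entry point would look something like:
#   def main(msg, outputblob):   # msg: ServiceBusMessage, outputblob: blob binding
#       outputblob.set(process_message(msg.get_body()))
```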

Keeping the function running for the last 30 days costs around $10. That's very acceptable, since the function has a lot of messages to consume. Writing the output to blob storage costs around $20. Also acceptable. What I don't understand is that the charge for the function's underlying storage account is around $70 for the same period. The cost is primarily driven by File Write Operation Units and File Protocol Operation Units. The storage account is a locally redundant general purpose v1 account.
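Putting the numbers together as a quick sanity check (figures taken from above):

```python
# All figures in USD for the 30-day period described above.
function_cost = 10         # consumption-plan executions
blob_output_cost = 20      # the function's own blob output
storage_account_cost = 70  # the function's underlying storage account

ratio = storage_account_cost / function_cost
total = function_cost + blob_output_cost + storage_account_cost

print(f"underlying storage costs {ratio:.0f}x the function itself")       # 7x
print(f"it accounts for {storage_account_cost / total:.0%} of the bill")  # 70%
```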

Can anyone explain what's going on here? When looking at the storage account, there are a few blobs, but I believe the problem is with table storage. Inspecting the storage account shows tables like these:

$MetricsCapacityBlob
$MetricsHourPrimaryTransactionBlob
AzureWebJobsHostLogs201804

I've disabled logging in my function by removing the AzureWebJobsDashboard app setting. After doing so, the AzureWebJobsHostLogs* tables no longer seem to receive new rows, but the $Metrics* tables still receive new data. I have no idea whether writes to these tables are causing all of the file write activity I see in the Cost Management view in the portal, though.
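(For reference, removing the setting can also be done from the CLI; the app and resource group names below are placeholders:)

```shell
# Placeholders: replace my-function-app / my-rg with your own names.
az functionapp config appsettings delete \
  --name my-function-app \
  --resource-group my-rg \
  --setting-names AzureWebJobsDashboard
```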

What's going on here? Is maintaining these tables really required for serverless code, and is it normal for storage access to cost 7x the price of the function itself?

ThomasArdal
  • Can you check the account kind for this storage account? If it is `General Purpose Storage V2`, the pricing for such accounts is significantly higher than for V1 accounts. – Gaurav Mantri Apr 17 '18 at 06:29
  • It's Locally-redundant general purpose v1 storage. – ThomasArdal Apr 17 '18 at 06:32
  • Do you mean that you are seeing **additional** entries getting created in Table Storage? If all you are seeing are the ones that existed before you made the change, then I don't understand the question. – David Ebbo Apr 17 '18 at 19:12
  • I didn't make any change. The function has always been running on the consumption plan; I just didn't look at the cost until now. I totally understand that there is a cost related to maintaining the storage, but I didn't expect it to be that expensive compared to running the function itself. – ThomasArdal Apr 17 '18 at 19:40
  • How many times does your function run per month? – kamil-mrzyglod Apr 18 '18 at 06:33
  • @Kamo I'm not sure about the exact number, but more than 5,000,000 times at least. – ThomasArdal Apr 18 '18 at 06:34
  • Are you 100% sure it's related to Azure Functions, not your blob storage? I've just checked my functions and none has File Write Operation Units/File Protocol Operation Units operations listed when it comes to attached storage pricing :( – kamil-mrzyglod Apr 18 '18 at 06:43
  • @Kamo I'm not sure. I just checked the tables named AzureWebJobsHostLogs* and they in fact don't contain new data since I disabled logging. Now I'm looking at a set of other tables named $MetricsCapacity* and $MetricHourPrimary*. I think it's very hard to see which files/tables are causing the costs, though :( – ThomasArdal Apr 18 '18 at 07:20
  • The question has been updated to reflect my recent findings. – ThomasArdal Apr 18 '18 at 07:32
  • IMHO this cost is generated by something else (since I've never seen the mentioned operations as part of Functions' underlying storage). Could you switch the used storage account to a fresh one and then check whether you're still getting them? – kamil-mrzyglod Apr 18 '18 at 07:53
  • @kamo Think that's a good idea. Will try that and publish the results here. Thanks! – ThomasArdal Apr 18 '18 at 09:18
  • @ThomasArdal Any update on that issue? – kamil-mrzyglod Apr 19 '18 at 13:43
  • @kamo I switched yesterday, so the recent cost report still shows both storage accounts. Will evaluate in a few days. – ThomasArdal Apr 19 '18 at 14:40
  • @kamo Just followed up. The new storage account is now consuming the same as before. I noticed a setting on my function named "AzureWebJobsSecretStorageType". The value is set to "Blob". From what I can see in the documentation, the default value is "disabled". I wonder if this could cause a lot of writes. – ThomasArdal Apr 20 '18 at 05:07
  • @ThomasArdal Is there any possibility to provide e.g. an ARM template or any way to reproduce your setup? I'd like to check this personally. – kamil-mrzyglod Apr 20 '18 at 06:32
  • @kamo Setting `AzureWebJobsSecretStorageType` to `disabled` didn't change anything. How do I extract the ARM template? Are you an MS employee, or are you just here to help? I'm thinking about creating an issue through the Portal. – ThomasArdal Apr 22 '18 at 17:57
  • You can do it via e.g. the `Automation Script` blade in the Azure Portal. I guess it'd be better to just post an issue on the GitHub repo. – kamil-mrzyglod Apr 22 '18 at 18:24

2 Answers

3

You should go to Metrics in the Azure Portal for this storage account and check the patterns of how the File storage transactions are consumed. If usage is consistently high, it's something in your application (e.g. too much logging to file).

In my case, it appears to be a bug in Azure Functions, and I filed a bug here.

The function starts consuming tens of thousands of read and write transactions after any code change, however minor. So each code change or deployment costs me around $0.20, and it could be more in your case.
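A back-of-envelope version of that estimate (the per-operation price here is an assumption for illustration; check the current Azure Files transaction pricing for your region and tier):

```python
# Assumed, illustrative unit price; real Azure Files transaction pricing
# varies by region, tier, and operation type.
price_per_10k_ops = 0.10     # USD per 10,000 file operations (assumption)
ops_per_deployment = 20_000  # "tens of thousands" of read/write transactions

cost_per_deployment = ops_per_deployment / 10_000 * price_per_10k_ops
print(f"~${cost_per_deployment:.2f} per deployment")  # ~$0.20, as above

# Deploying once a day therefore adds a noticeable amount on its own:
print(f"~${cost_per_deployment * 30:.2f}/month from daily deployments")
```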

This is easy to see in the Metrics diagram because it looks like a huge spike in transactions.

So the solution is: don't write logs to the filesystem and don't deploy often.

[Screenshot: the Metrics chart, showing a large spike in transactions after each deployment]

1

It is interesting and unusual that your storage cost is so much higher. I think the dashboard logging is a likely culprit, so it would be good to understand whether you see a drop over the next few days with it turned off.

I would spend a bit more time in the cost analysis section of the Azure Portal to see if you can get more details about exactly which aspect of your storage usage is driving the majority of the cost, i.e. whether it is table operations, blob operations, etc. This screenshot shows the Cost History view with a breakdown per meter. Note the tooltip in this screenshot:

[Screenshot: the Cost History view with a per-meter breakdown and a tooltip showing the meter details]

The $Metrics tables are not written by Azure Functions; they are generated by Azure Storage itself. I would be surprised if these metrics were contributing significantly to your overall cost. But if you want to experiment, I think you can disable them through this UX:
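(If you would rather not click through the portal, the CLI equivalent looks roughly like this; the storage account name is a placeholder, and this disables the hourly and per-minute metrics that feed the $Metrics tables for the blob service:)

```shell
# Placeholder account name; --services b targets the blob service
# (combinations of b/f/q/t cover file, queue, and table metrics too).
az storage metrics update \
  --account-name mystorageaccount \
  --services b \
  --api false \
  --hour false \
  --minute false \
  --retention 0
```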

[Screenshot: the storage account's metrics configuration blade, where the hourly and per-minute metrics can be disabled]

To give you a baseline on what ratio of storage cost to function execution cost is expected, you might want to take a look at the cost write-up I did in this blog post: https://blogs.msdn.microsoft.com/appserviceteam/2017/09/19/processing-100000-events-per-second-on-azure-functions/

You'll notice that the storage costs were less than the function costs, and that includes a significant number of storage operations because event hubs processing requires checkpoints written to storage. I'll note that these tests were run with dashboard logging off (again making me suspect that as the main cost driver). So no, it is NOT normal for your storage costs to be 7x your function costs!

Paul Batum
  • I believe I have already disabled dashboard logging by removing the `AzureWebJobsDashboard` app setting, right? And I did also dig into the details of the consumption. It's "File Write Operation Units" and "File Protocol Operation Units" that are the main cost drivers. I'm thinking about creating a new function and moving my code to it to check if that changes anything. I already tried creating a new storage account to check whether another service caused this, but it didn't help. – ThomasArdal Apr 27 '18 at 06:21
  • 1
    I've deployed the code to a new function app. Will follow up in two days and post the results here. Thank you so far! – ThomasArdal Apr 27 '18 at 07:10
  • Interesting! So file units come from Azure Files usage. Your function app's file system is backed by Azure Files. But there should not be file write operations unless you are deploying constantly or writing to the file system from your function code. I guess the other possibility is that you have file system logging force-enabled. Can you check your host.json? You're looking for a setting called "fileLoggingMode"; if that is set to "always", then that's the culprit. – Paul Batum Apr 27 '18 at 18:58
  • 1
    I believe creating a new function and deploying the code to that has fixed the problem. I'll follow up again in a couple of days, to see if storage costs are still as low as now. If you want to look into the configuration of the old function, let me know. – ThomasArdal Apr 29 '18 at 06:24
  • I think the key aspect I'm still interested in is whether your host.json specifies a file logging mode, as to me it's the most likely explanation. – Paul Batum Apr 30 '18 at 23:45
  • My host.json contains an ID only. – ThomasArdal May 01 '18 at 05:38
  • Hmm, that disproves my primary theory. Let me know if anything new comes up. – Paul Batum May 02 '18 at 21:05
  • 1
    I won't be digging more into this issue, since costs are still low. If you want to look more into it, feel free and I'll also help you extract whatever info you need. I'm guessing this can happen for everyone using Functions and while it's great for Azure profits, it's not great for the customers ;) – ThomasArdal May 03 '18 at 05:52