I have multiple services in a single project. Let's say I have two GAE (Google App Engine) services: the first is service_a and the second is service_b. Both services write logs. I've already labeled both GAE services, but it seems the labels are not reflected in the log cost. Is there any way to differentiate the log cost between service_a and service_b?
-
How long ago did you create the labels? You must wait until the next billing cycle. – John Hanley May 22 '23 at 05:36
-
@JohnHanley More than 3 months, FYI. This happens on all services, e.g. GAE, Cloud Dataflow, Cloud Functions, Cloud Run, etc. – Alexander Chandra May 22 '23 at 06:25
-
Are you exporting the billing data to BigQuery? This [link](https://cloud.google.com/billing/docs/how-to/bq-examples#query-with-labels) shows how to query based on labels. – John Hanley May 22 '23 at 06:52
-
Can you refer to [audit logs estimate costs](https://cloud.google.com/bigtable/docs/audit-log-estimate-costs) and the [Cloud Logging pricing summary](https://cloud.google.com/stackdriver/pricing#logging-costs), which might help you understand the log cost. You can also try the [Price calculator](https://cloud.google.com/products/calculator#id=): search for the `Cloud Operations` product and enter the values it asks for. You need to know the volume of log data each service receives; by entering those values, you can estimate the log cost of each service. – Hemanth Kumar May 22 '23 at 08:36
-
Are you trying to figure out how much the logs cost for each service? I am not aware of a standard method to do that. The pricing for Cloud Logging is per GB. The Cloud Logging cost details are not broken down by service. – John Hanley May 22 '23 at 19:23
-
@JohnHanley > "Are you trying to figure out how much the logs cost for each service?" Yes, exactly. So currently there is no solution for this? – Alexander Chandra May 23 '23 at 03:12
-
You would need to process each log entry, estimate its size, figure out which service sent the log, and tabulate. Google Cloud does not provide a service that does what you are asking for. – John Hanley May 23 '23 at 03:48
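The per-entry tabulation described in the last comment can be sketched in a few lines. This is a minimal sketch, assuming each log entry is available as a dict shaped like a Cloud Logging `LogEntry` (service identified via `resource.type`) and using serialized JSON length as a rough proxy for billable size; the sample entries are hypothetical:

```python
import json
from collections import defaultdict

def tabulate_log_sizes(entries):
    """Group estimated log entry sizes (in bytes) by originating service.

    Each entry is assumed to be a dict shaped like a Cloud Logging LogEntry,
    with the service identified by entry["resource"]["type"]. The serialized
    JSON length is only a rough proxy for the billable ingestion size.
    """
    sizes = defaultdict(int)
    for entry in entries:
        service = entry.get("resource", {}).get("type", "unknown")
        sizes[service] += len(json.dumps(entry).encode("utf-8"))
    return dict(sizes)

# Hypothetical entries standing in for real LogEntry payloads.
entries = [
    {"resource": {"type": "gae_app"}, "textPayload": "request handled"},
    {"resource": {"type": "gae_app"}, "textPayload": "warmup"},
    {"resource": {"type": "dataflow_step"}, "textPayload": "worker started"},
]
print(tabulate_log_sizes(entries))
```

In practice the entries would come from a log sink export (e.g. to BigQuery) rather than being held in memory.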
2 Answers
After wasting a few days on this, I've found an alternative solution.
By default, you have two log storage buckets and two log router sinks:
- _Default
- _Required (ignore this one)
From my understanding:
- A log storage bucket is the storage that holds your logs.
- A log router sink is a rule that decides where a log will be stored.
By default, your logs are stored in the _Default bucket.
Here is the default state in Terraform:

```hcl
resource "google_logging_project_bucket_config" "_Default" {
  project        = data.google_project.project.project_id
  location       = "global"
  retention_days = 30
  bucket_id      = "_Default"
  description    = "Default"
}

resource "google_logging_project_sink" "_default" {
  name                   = "_Default"
  disabled               = false
  destination            = "logging.googleapis.com/projects/my-project/locations/global/buckets/_Default"
  filter                 = "NOT LOG_ID(\"cloudaudit.googleapis.com/activity\") AND NOT LOG_ID(\"externalaudit.googleapis.com/activity\") AND NOT LOG_ID(\"cloudaudit.googleapis.com/system_event\") AND NOT LOG_ID(\"externalaudit.googleapis.com/system_event\") AND NOT LOG_ID(\"cloudaudit.googleapis.com/access_transparency\") AND NOT LOG_ID(\"externalaudit.googleapis.com/access_transparency\")"
  unique_writer_identity = true
}
```
So the solution is to create a log storage bucket and a log router sink for each of your services. See the Terraform code below (in this case, I have one GAE service and one Dataflow job):
resource "google_logging_project_bucket_config" "_Default" {
project = data.google_project.project.project_id
location = "global"
retention_days = 30
bucket_id = "_Default"
description = "Default"
}
resource "google_logging_project_bucket_config" "ingestion_dataflow_logging_bucket" {
project = data.google_project.project.project_id
location = "global"
retention_days = 30
bucket_id = "ingestion_dataflow"
description = "ingestion dataflow logging bucket"
}
resource "google_logging_project_bucket_config" "gae_minerva_logging_bucket" {
project = data.google_project.project.project_id
location = "global"
retention_days = 30
bucket_id = "gae_minerva"
description = "gae minerva logging bucket"
}
resource "google_logging_project_sink" "_default" {
name = "_Default"
disabled = false
destination = "logging.googleapis.com/projects/my-project/locations/global/buckets/_Default"
filter = "NOT LOG_ID(\"cloudaudit.googleapis.com/activity\") AND NOT LOG_ID(\"externalaudit.googleapis.com/activity\") AND NOT LOG_ID(\"cloudaudit.googleapis.com/system_event\") AND NOT LOG_ID(\"externalaudit.googleapis.com/system_event\") AND NOT LOG_ID(\"cloudaudit.googleapis.com/access_transparency\") AND NOT LOG_ID(\"externalaudit.googleapis.com/access_transparency\")"
exclusions {
name = "dataflow-ingestion-exclusions"
disabled = false
filter = "resource.type=\"dataflow_step\" AND jsonPayload.worker:\"dataflow_ingestion_worker\""
}
exclusions {
name = "gae-minerva-exclusions"
disabled = false
filter = "resource.type=\"gae_app\" AND resource.labels.module_id:\"minerva_project\""
}
unique_writer_identity = true
}
resource "google_logging_project_sink" "ingestion_dataflow_logging_sink" {
name = "ingestion_dataflow"
disabled = false
destination = "logging.googleapis.com/projects/my-project/locations/global/buckets/ingestion_dataflow"
filter = "resource.type=\"dataflow_step\" AND jsonPayload.worker:\"dataflow_ingestion_worker\""
unique_writer_identity = true
}
resource "google_logging_project_sink" "gae_minerva_logging_sink" {
name = "gae_minerva"
disabled = false
destination = "logging.googleapis.com/projects/my-project/locations/global/buckets/gae_minerva"
filter = "resource.type=\"gae_app\" AND resource.labels.module_id:\"minerva_project\""
unique_writer_identity = true
}
By creating a storage bucket for each service, you can see each service's log volume, and therefore estimate the log cost per service.
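Once each service's logs land in their own bucket, the observed ingestion volume can be turned into a rough cost figure. A minimal sketch, assuming Cloud Logging's published ingestion rate of $0.50/GiB beyond a 50 GiB monthly free allotment per project (verify against the current pricing page); the service names and volumes are hypothetical:

```python
FREE_GIB = 50.0      # assumed monthly free ingestion allotment per project
RATE_PER_GIB = 0.50  # assumed USD per GiB beyond the free tier; check the pricing page

def estimate_log_cost(gib_per_service):
    """Apportion the project's billable log ingestion cost across services.

    `gib_per_service` maps a service name to its monthly ingestion volume in
    GiB (e.g. the volume observed in that service's dedicated log bucket).
    The free allotment is project-wide, so each service's share of the
    billable volume is proportional to its share of the total.
    """
    total = sum(gib_per_service.values())
    if total == 0:
        return {service: 0.0 for service in gib_per_service}
    billable = max(total - FREE_GIB, 0.0)
    return {
        service: round(billable * (gib / total) * RATE_PER_GIB, 2)
        for service, gib in gib_per_service.items()
    }

# Hypothetical monthly volumes read from each per-service bucket.
print(estimate_log_cost({"gae_minerva": 120.0, "ingestion_dataflow": 80.0}))
# → {'gae_minerva': 45.0, 'ingestion_dataflow': 30.0}
```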

As a workaround, you can try exploring the Price Calculator. In the calculator, search for the Cloud Operations product and enter the values it asks for. You need to know the volume of log data each service receives; by entering those values, you can estimate the log cost of each service.
Below is an example for reference:
If service A received 100 GiB of logs, enter that value as shown in screenshot 1.
After entering the value and clicking Estimate, you will see the estimated cost as shown in screenshot 2.
Similarly, check service B. This is just a suggestion for your query.
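For a sense of what the calculator computes, here is the arithmetic for the 100 GiB example above, assuming the published $0.50/GiB ingestion rate beyond a 50 GiB monthly free allotment (verify against the current pricing page):

```python
# Hypothetical back-of-the-envelope check for service A's 100 GiB of logs.
volume_gib = 100.0
free_gib = 50.0           # assumed project-wide monthly free allotment
rate_usd_per_gib = 0.50   # assumed ingestion rate; check the pricing page

cost = max(volume_gib - free_gib, 0.0) * rate_usd_per_gib
print(f"estimated monthly cost: ${cost:.2f}")  # → estimated monthly cost: $25.00
```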
Also, as per @JohnHanley's comments, if there is no such method, raise this as a feature request on the Public Issue Tracker with a description of your issue. The Issue Tracker is a forum for end users to report bugs and request features to improve Google Cloud products. Google's product engineering will work on the implementation.
