After an Apache Beam (Google Cloud Dataflow 2.0) job finishes, a ready-made command appears at the end of the logs:

```
bq show -j --format=prettyjson --project_id=<My_Project_Id> 00005d2469488547749b5129ce3_0ca7fde2f9d59ad7182953e94de8aa83_00001-0
```

which can be run from the Google Cloud SDK command prompt.
Basically, it shows all the job information: start time, end time, number of bad records, number of records inserted, and so on. I can see this information on the Cloud SDK console, but where is it stored?
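For reference, here is a minimal sketch of pulling those fields out of the bq show JSON with jq, assuming a load job; the statistics.* field names follow the BigQuery Jobs API, and the job ID is a placeholder:

```
# Sketch: extract selected statistics from the bq show JSON with jq.
# Assumes a load job; statistics.load.* names come from the BigQuery Jobs API.
bq show -j --format=prettyjson --project_id=<My_Project_Id> <job_id> | jq '{
  startTime:  .statistics.startTime,
  endTime:    .statistics.endTime,
  outputRows: .statistics.load.outputRows,
  badRecords: .statistics.load.badRecords
}'
```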
I checked the Stackdriver logs; they only have data up to the previous day, and even that is not the complete information that the Cloud SDK console shows.
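For context, this is roughly how I was reading those entries from the command line; a sketch, assuming the Dataflow job logs appear under resource.type="dataflow_step":

```
# Sketch: read recent Dataflow log entries from the CLI.
# resource.type="dataflow_step" is an assumption about where these entries live.
gcloud logging read 'resource.type="dataflow_step"' \
  --project=<My_Project_Id> --limit=5 --format=json
```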
If I want to export this information and load it into BigQuery, where can I get it?
Update: This is possible. I found the information after adding the filter resource.type="bigquery_resource" in the Stackdriver logs viewer, but it shows the timestamp fields CreateTime, StartTime, and EndTime as 1970-01-01T00:00:00Z.
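Given that filter, a minimal sketch of exporting the matching entries to a BigQuery dataset via a log sink; the sink name and dataset are placeholders:

```
# Sketch: route the filtered log entries into a BigQuery dataset via a sink.
# <my_dataset> must already exist; the sink name is arbitrary.
gcloud logging sinks create bq-job-logs \
  bigquery.googleapis.com/projects/<My_Project_Id>/datasets/<my_dataset> \
  --log-filter='resource.type="bigquery_resource"'
```

The command prints a writer identity (a service account) that must be granted WRITER access on the dataset before entries start flowing.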