
I am reading data from one BigQuery table and writing it into another BigQuery table using a batch Dataflow pipeline built with Apache Beam and Java.

A temp dataset is created in the same project. The temp tables inside it expire after one day, but the temp dataset itself remains in BigQuery.

I am running the batch Dataflow job from a template created in Google Cloud.

Has anyone faced this kind of issue?

1 Answer


That is because an expiration time can be set only on tables, not on datasets. You can inspect your temp dataset with the following commands:

bq ls -a
bq show --format=prettyjson $TEMP_DATASET

There you will find "defaultPartitionExpirationMs" and "defaultTableExpirationMs", but nothing related to a dataset expiration.
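Since the dataset will never expire on its own, one option is to delete it manually once the job is done. A minimal sketch with the bq CLI, assuming a hypothetical dataset name (substitute the one listed by `bq ls -a`):

```shell
# Hypothetical temp dataset reference; replace with the dataset that
# Dataflow actually created in your project (see `bq ls -a`).
TEMP_DATASET="myproject:temp_dataset_beam_12345"

# Remove the leftover dataset and everything still inside it:
#   -r = recursive (delete contained tables), -f = no confirmation prompt,
#   -d = the argument is a dataset.
# Guarded so the line is a no-op on machines without the bq CLI installed.
if command -v bq >/dev/null 2>&1; then
  bq rm -r -f -d "$TEMP_DATASET"
fi
```

Running this on a schedule (for example from Cloud Scheduler) after the batch job completes keeps the project free of orphaned temp datasets.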

Enrique Zetina