When deploying a pipeline that includes this WriteToBigQuery step:
beam.io.WriteToBigQuery(
    method='STORAGE_WRITE_API',
    table=[TABLE],
    schema=[PATH TO SCHEMA ON GCS BUCKET],
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
    insert_retry_strategy='RETRY_NEVER',
)
I get the following error:
File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/collections/__init__.py", line 402, in namedtuple
raise ValueError('Field names cannot start with an underscore: '
ValueError: Field names cannot start with an underscore: '_embedded'
BigQuery itself does support field names that start with an underscore, and judging by the traceback the failure comes from the Beam SDK: the Storage Write API path appears to build a collections.namedtuple from the schema, and namedtuple rejects field names with a leading underscore. Is this a bug, and are there any workarounds?
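
For reference, the only workaround I can think of is renaming the offending fields just before the write step. This is only a minimal sketch of that idea; it assumes the elements are plain dicts, that only top-level keys are affected, and the names records and strip_leading_underscores are just illustrative:

import apache_beam as beam

def strip_leading_underscores(element):
    # Rename top-level keys such as '_embedded' -> 'embedded' so that the
    # namedtuple conversion on the Storage Write API path accepts them.
    # (Assumes no collisions, i.e. no element has both '_embedded' and 'embedded'.)
    return {key.lstrip('_'): value for key, value in element.items()}

renamed = records | 'StripUnderscorePrefixes' >> beam.Map(strip_leading_underscores)

That changes the column names in the BigQuery table, though, which I would rather avoid if possible.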