I'm getting an error while trying to transfer files from Google Cloud Storage to Google BigQuery. This is the error:
failed with error INVALID_ARGUMENT : Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details.
When I look at the error details, I see this:
Error while reading data, error message: JSON parsing error in row starting at position 0: No such field: _ttl.
I don't understand where the problem is. If I transfer just one file, the data is loaded into my BigQuery table, but I need to transfer all the files every day. This is an example of my data format:
{"createdAt":"2021-12-07T12:07:44.547Z","_lastChangedAt":1638878864561,"isMain":true,"__typename":"Accounting","name":"main","belongTo":"siewecarine","id":"d00ae4ad-c661-40b8-9e90-e0f53b2211fb","_version":1,"updatedAt":"2021-12-07T12:07:44.547Z"}
{"createdAt":"2021-12-07T12:09:12.583Z","_lastChangedAt":1638878952618,"isMain":false,"__typename":"Accounting","name":"test1","belongTo":"mbappe","id":"ee42db80-a6f4-400c-a089-061cc7eec967","_version":1,"updatedAt":"2021-12-07T12:09:12.583Z"}
My data comes from an S3 bucket; it is exported in batches from DynamoDB.
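From the error, the failing row apparently carries a `_ttl` attribute that my BigQuery table schema doesn't declare (the sample rows above don't show it, so this is an assumption). A minimal sketch of a workaround I'm considering, stripping unknown keys from each NDJSON line before loading (the `SCHEMA_FIELDS` set is taken from the sample records above):

```python
import json

# Fields declared in the BigQuery table schema (assumed from the sample rows)
SCHEMA_FIELDS = {
    "createdAt", "_lastChangedAt", "isMain", "__typename",
    "name", "belongTo", "id", "_version", "updatedAt",
}

def strip_unknown_fields(ndjson_line: str) -> str:
    """Drop keys (e.g. `_ttl`) that the table schema does not declare."""
    record = json.loads(ndjson_line)
    cleaned = {k: v for k, v in record.items() if k in SCHEMA_FIELDS}
    return json.dumps(cleaned)

# Example: a record with an extra `_ttl` key, as in the failing rows
line = '{"name":"main","_ttl":1638878864,"isMain":true}'
print(strip_unknown_fields(line))  # the `_ttl` key is removed
```

Alternatively, I understand a BigQuery load job can be configured with `ignore_unknown_values` so that extra JSON keys are skipped instead of failing the load, but I'm not sure how to set that on the transfer itself.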