I'm running a simple BigQuery query over my dataset, which is about 84 GB of log data.
The query takes approximately 110 seconds to complete. Is this normal for a dataset of this size?
After further investigation, it looks like your table was heavily fragmented. We usually have a coalesce process running to prevent this situation, but it had been off for a couple of weeks while we were verifying a bug fix. I've restarted the coalescer and run it against your table. Please let me know if you continue to see poor performance.
As a best practice, you may be better off importing somewhat less frequently in larger chunks, or splitting your data into time-based tables. BigQuery isn't really designed to handle high-volume small imports to the same table.
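As a rough sketch of the time-based-tables suggestion: instead of appending everything to one table, you can shard by day (e.g. `logs_20240101`, `logs_20240102`, ...) and query a date range with a wildcard table. The project and dataset names below are placeholders, and this assumes standard SQL wildcard tables, which expose the shard suffix as `_TABLE_SUFFIX`:

```sql
-- Query one week of daily-sharded log tables.
-- `myproject.mydataset` and the logs_* naming are hypothetical;
-- substitute your own project, dataset, and shard convention.
SELECT
  COUNT(*) AS request_count
FROM
  `myproject.mydataset.logs_*`
WHERE
  _TABLE_SUFFIX BETWEEN '20240101' AND '20240107';
```

Because each day's data lands in its own table, imports stay large and infrequent per table, and queries only scan the shards in the requested range rather than the full 84 GB.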