I am exporting Dialogflow interaction logs to BigQuery, into a table created with the structure described in the link below:
https://cloud.google.com/dialogflow/cx/docs/concept/export-bq
CREATE TABLE <your_dataset_name>.dialogflow_bigquery_export_data(
  project_id STRING,
  agent_id STRING,
  conversation_name STRING,
  turn_position INTEGER,
  request_time TIMESTAMP,
  language_code STRING,
  request JSON,
  response JSON,
  partial_responses JSON,
  derived_data JSON,
  conversation_signals JSON
);
Some of the fields, like request, response, and derived_data, are of type JSON.
I'm wondering what the best way is to analyze and query this table. Do I need to flatten it into another table, or into multiple tables, one per JSON column?
How should I approach data modelling for this table?
I'm looking to flatten the data, if that's the right approach for analyzing it.
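For context, the only thing I've tried so far is querying the JSON columns in place with BigQuery's JSON functions. Something like the sketch below works for pulling out individual scalars (the JSON paths here are my guesses based on the Dialogflow CX request/response shape, not verified against my actual data):

```sql
SELECT
  conversation_name,
  turn_position,
  request_time,
  -- JSON_VALUE returns a scalar STRING, or NULL if the path is absent
  JSON_VALUE(request, '$.queryInput.text.text') AS user_utterance,
  JSON_VALUE(response, '$.queryResult.intent.displayName') AS matched_intent,
  -- JSON_QUERY returns the nested JSON subtree as JSON
  JSON_QUERY(response, '$.queryResult.responseMessages') AS response_messages
FROM `<your_dataset_name>.dialogflow_bigquery_export_data`
ORDER BY conversation_name, turn_position;
```

This gets unwieldy quickly, though, which is why I'm asking whether flattening into separate tables (or views) is the better long-term model.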