I'm using Google Cloud Datalab for my ML project. Part of my data lives in a BigQuery table with millions of records (text data) and many columns. I created a pandas DataFrame from the BigQuery table, converted it to a Dask DataFrame (with 5 partitions), and performed my data wrangling on that. Roughly, the pipeline looks like the sketch below.
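For reference, this is approximately what I'm doing now (the project, dataset, and table names are placeholders):

```python
# Rough sketch of the current pipeline (project/dataset/table names are placeholders).
import dask.dataframe as dd
from google.cloud import bigquery

client = bigquery.Client(project="my-project")                # placeholder project
query = "SELECT * FROM `my-project.my_dataset.my_table`"      # placeholder table

# Pull the BigQuery table into a pandas DataFrame.
df = client.query(query).to_dataframe()

# Convert to a Dask DataFrame with 5 partitions and do the wrangling on it.
ddf = dd.from_pandas(df, npartitions=5)
```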
Now I want to either store this Dask DataFrame back in BigQuery, or convert it to Parquet files and store them in my GCS bucket. It would be great to hear options from the community. Thanks.
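These are the two approaches I've been considering so far, but I'm not sure they are the right way to do it (bucket, table, and project names are placeholders; the Parquet route assumes gcsfs and pyarrow are installed):

```python
# Option 1: write Parquet files straight to GCS from Dask (assumes gcsfs + pyarrow are available).
ddf.to_parquet(
    "gs://my-bucket/wrangled-data/",  # placeholder bucket/path
    engine="pyarrow",
    write_index=False,
)

# Option 2: materialize back to pandas and load into BigQuery with pandas-gbq.
import pandas_gbq

pandas_gbq.to_gbq(
    ddf.compute(),                    # collects all partitions into one pandas DataFrame
    "my_dataset.wrangled_table",      # placeholder destination table
    project_id="my-project",          # placeholder project
    if_exists="replace",
)
```

My concern with option 2 is that calling `compute()` pulls everything into memory on one machine, which may not scale for millions of records, so I'd like to know if there's a better way.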