I have a query that works fine. The problem is that part of the query is a string that needs to be read from a file. The query produces 6 output rows for each string. I need a union of the results for every string in the file, so that the end result is a table of 6 × (number of strings) rows. I can read the file using Python.

I've already tried using parameterised queries. Each of them only returns the 6 rows for a single string.

Most of my Python code is based on BigQuery's documentation here.

query = """
    SELECT pet_id, age, name
    FROM `myproject.mydataset.mytable`
    WHERE name = @name
    AND species = @species;
"""
query_params = [
    bigquery.ScalarQueryParameter('name', 'STRING', 'Max'),
    bigquery.ScalarQueryParameter('species', 'STRING', 'Dog'),
    bigquery.ScalarQueryParameter('name', 'STRING', 'Alfred'),
    bigquery.ScalarQueryParameter('species', 'STRING', 'Cat')
]
job_config = bigquery.QueryJobConfig()
job_config.query_parameters = query_params
query_job = client.query(
    query,
    # Location must match that of the dataset(s) referenced in the query.
    location='US',
    job_config=job_config)  # API request - starts the query

# Print the results
for row in query_job:
    print('{}\t{}\t{}'.format(row.pet_id, row.age, row.name))

How can I get a UNION ALL of many of these query results?

The output should look like this:

pet_id | age | name
___________________
1      | 5   | Max
2      | 8   | Alfred

Pranay Nanda

1 Answer


Please look at the example below, which uses public data (you can run the query yourself):

#standardSQL
SELECT * 
FROM `bigquery-public-data.baseball.schedules`
WHERE (year, duration_minutes) IN UNNEST([(2016, 187), (2016, 165), (2016, 189)])

The key here is to provide an array of the values you want to filter the table with, and use IN UNNEST(array_of_values) to do the job, ideally like below:

query = """
    SELECT pet_id, age, name
    FROM `myproject.mydataset.mytable`
    WHERE (name, species) IN UNNEST(@filter_array);
"""

It is a bit unfortunate that the BigQuery Python API doesn't let you specify ARRAY<STRUCT<STRING, INT64>> as a query parameter, so you may have to do:

query = """
    SELECT pet_id, age, name
    FROM `myproject.mydataset.mytable`
    WHERE concat(name, "_", species) IN UNNEST(@filter_array);
"""
array_of_pre_concatenated_name_and_species = ['Max_Dog', 'Alfred_Cat']
query_params = [
    bigquery.ArrayQueryParameter('filter_array', 'STRING', array_of_pre_concatenated_name_and_species),
]
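
Putting this together with the file reading from your question, a rough end-to-end sketch could look like this (the file name pets.txt and its one-value-per-line format are just placeholders for however you read your strings):

from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT pet_id, age, name
    FROM `myproject.mydataset.mytable`
    WHERE CONCAT(name, '_', species) IN UNNEST(@filter_array);
"""

# Hypothetical input file: one pre-concatenated value per line, e.g. "Max_Dog".
with open('pets.txt') as f:
    filter_values = [line.strip() for line in f if line.strip()]

job_config = bigquery.QueryJobConfig()
job_config.query_parameters = [
    bigquery.ArrayQueryParameter('filter_array', 'STRING', filter_values),
]
query_job = client.query(query, location='US', job_config=job_config)

# A single result set covering every value in the array --
# no client-side UNION ALL needed.
for row in query_job:
    print('{}\t{}\t{}'.format(row.pet_id, row.age, row.name))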
Yun Zhang
  • My query is more like `WITH str AS ( SELECT 'Max' AS STRING) SELECT name from main_table as mt INNER JOIN str ON mt.name == str.STRING`. The string 'Max' is derived from the file. Where should I put the `IN UNNEST(@filter_array)` statement? – Pranay Nanda Apr 03 '19 at 10:15
  • For now, I've ingested those strings from the file to a table. That way I've got that `INNER JOIN` working. It could be better if I did not have to ingest. – Pranay Nanda Apr 03 '19 at 12:06
  • @PranayNanda, in the above example, your strings are put into this array: `array_of_pre_concatenated_name_and_species = ['Max_Dog', 'Alfred_Cat']` – Yun Zhang Apr 03 '19 at 15:14