
I modified the default hello_pubsub (Python 3.7) function of the Google Cloud Functions service to connect to a dataset table I created in the Google Cloud BigQuery service. But after trying many different approaches, I'm still clueless about how to make this function connect to the dataset I created in BigQuery. I'm fairly sure the error is only in the SQLite code of the function. Someone, please help me.

My code is:

import sqlite3
import base64

def hello_pubsub(event, context):
    """Triggered from a message on a Cloud Pub/Sub topic.
    Args:
         event (dict): Event payload.
         context (google.cloud.functions.Context): Metadata for the event.
    """
    connection = sqlite3.connect("<BigQueryDataset>")
    crsr = connection.cursor()
    crsr.execute("SELECT * FROM `iotcoretutorial-xxxxxx.DHT11.DHT11Data` WHERE temperature > 24") 
    ans = crsr.fetchall()

    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    print("Hello "+pubsub_message)
    print(ans)

PS: Here iotcoretutorial-xxxxxx refers to the project ID. I used xxxxxx to hide my project's identity (please bear with me on that!)

In short, iotcoretutorial-xxxxxx.DHT11.DHT11Data is a table I created in Google Cloud BigQuery with temperature and humidity values, which I want to print using the hello_pubsub function, and then take some action if the temperature value is > 24.

saikumar
    Why are you trying to connect to a `bigquery` dataset with the `sqlite3` library? You should probably look at the [BigQuery Client Libraries](https://cloud.google.com/bigquery/docs/reference/libraries) – PyPingu Jul 31 '19 at 11:16
  • I'd suggest you change the title. This is not helping you now and it's not going to help anybody else in the future. – tmt Jul 31 '19 at 11:36

1 Answer


Try using the [BigQuery Client Libraries](https://cloud.google.com/bigquery/docs/reference/libraries).

import base64
from google.cloud import bigquery

def hello_pubsub(event, context):
    """Triggered from a message on a Cloud Pub/Sub topic.
    Args:
         event (dict): Event payload.
         context (google.cloud.functions.Context): Metadata for the event.
    """
    client = bigquery.Client(project="<your-project>")

    query_str = "SELECT * FROM `iotcoretutorial-xxxxxx.DHT11.DHT11Data` WHERE temperature > 24"

    job = client.query(
        query_str,
        # Location must match that of the dataset(s) referenced in the query.
        location="<your-location>"
    )
    # Wait for the job to finish
    job.result()
    # Get the results as a DataFrame (this requires pandas).
    # You can do something different with your results here if you want.
    ans_df = job.to_dataframe()

    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    print("Hello " + pubsub_message)
    print(ans_df)
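If you'd rather not pull in the pandas dependency, you can also iterate the rows that `job.result()` returns directly; each row behaves like a mapping, so `row["temperature"]` works. A minimal sketch of that row-handling logic, assuming the table has `temperature` and `humidity` columns (the `summarize_hot_readings` helper name is my own):

```python
def summarize_hot_readings(rows):
    """Collect (temperature, humidity) pairs for readings above 24."""
    hot = []
    for row in rows:
        if row["temperature"] > 24:
            hot.append((row["temperature"], row["humidity"]))
    return hot

# In the Cloud Function you would call: summarize_hot_readings(job.result())
# Locally you can exercise the same logic with plain dicts standing in for rows:
print(summarize_hot_readings([
    {"temperature": 25.0, "humidity": 60.0},
    {"temperature": 22.0, "humidity": 55.0},
]))  # → [(25.0, 60.0)]
```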
PyPingu
  • Just to inform! We need to add `google-cloud-bigquery` (in my case version 1.17.0) to `requirements.txt` – saikumar Aug 01 '19 at 11:00
  • Yes, you would need to add that library to your `requirements.txt` if you are using one – PyPingu Aug 01 '19 at 11:01
  • Can you please tell me how to test it, like printing stuff before triggering it? Also, my last question: I want to send a configuration update to an ESP32 (blink its LED) whenever my temperature goes above 24. How do I do this? Thanks in advance – saikumar Aug 01 '19 at 11:37
  • What do you mean printing before triggering? Printing where? And I'm afraid I don't know anything about `ESP32` or LEDs. – PyPingu Aug 01 '19 at 11:47
  • Printing only the data with temperature > 24, in Google Cloud Shell! And thanks for helping me the best you can – saikumar Aug 01 '19 at 12:58
  • Isn't that what those `print` statements do? I'm sorry I haven't used Pub/Sub much – PyPingu Aug 01 '19 at 13:14
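On the testing question above: the Pub/Sub decoding in `hello_pubsub` can be exercised locally by building a fake event dict shaped like the one Cloud Functions delivers (a base64-encoded `data` field). A small sketch, assuming a plain UTF-8 text message (the `make_event` helper name is my own):

```python
import base64

def make_event(message: str) -> dict:
    """Build a fake Pub/Sub event payload like the one Cloud Functions passes in."""
    return {"data": base64.b64encode(message.encode("utf-8"))}

# Simulate what hello_pubsub does with the event, without deploying anything.
event = make_event("World")
decoded = base64.b64decode(event["data"]).decode("utf-8")
print("Hello " + decoded)  # → Hello World
```

The BigQuery call itself still needs credentials and a real project, so locally you would typically test only this decoding step and the row-handling logic separately.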