There aren't enough specifics in the question for a specific answer, so let me show you my general approach:
First, let's get the entity sentiment analysis of one arbitrary sentence with the gcloud CLI:
gcloud --format json ml language analyze-entity-sentiment --content "It's time we just let this thing go - it was a pretty good bad idea, wasn't it though? -- Bad Idea, Sara Bareilles" | jq -c . > sentiments.json
Note that I used jq -c to compact the pretty-printed JSON output into a single line, and stored the result in a file.
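If you have more than one sentence to analyze, it's the same idea in a loop - a minimal sketch, assuming a file sentences.txt with one sentence per line (the filename is just an example):

# emit one compact JSON line per analyzed sentence
while IFS= read -r sentence; do
  gcloud --format json ml language analyze-entity-sentiment --content "$sentence" | jq -c .
done < sentences.txt > sentiments.json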
To load this file - potentially multiple JSON lines, one per analyzed sentence - into BigQuery:
bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON temp.sentiments sentiments.json
The question asks to "stream into BigQuery", but it might make more sense to batch the load, as shown here.
Now we have a table with the results in BigQuery:
SELECT * FROM `fh-bigquery.temp.sentiments` LIMIT 1000
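Since the entities come back as a repeated record, you'll probably want to flatten them. A sketch, assuming schema auto-detection kept the API's field names (entities, name, salience, sentiment):

SELECT entity.name, entity.salience, entity.sentiment.score, entity.sentiment.magnitude
FROM `fh-bigquery.temp.sentiments`, UNNEST(entities) AS entity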

Btw, I added Sara Bareilles to the sentence to make sure that BigQuery got a full schema for auto-detection when creating the table the first time - a well-known entity makes the API populate fields (like the entity metadata) that auto-detect would otherwise never see.
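To double-check what auto-detection came up with, you can inspect the generated schema:

bq show --schema --format=prettyjson temp.sentiments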
If you want to stream data into BigQuery instead, look at the streaming inserts documentation. I wanted to isolate in this answer the basics of getting Cloud NLP data into BigQuery and looking at it - the rest is just the basics of working with these tools.
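For completeness, the streaming route with the same file would look like this - bq insert sends rows through the streaming API (tabledata.insertAll):

bq insert temp.sentiments sentiments.json

Note that unlike bq load, bq insert won't create the table or auto-detect a schema, so the table has to exist already.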