Maybe it's not exactly what you were looking for, but the following options are the closest workarounds to what you're trying to achieve.
First of all, you can save your data locally in a CSV file with one column and then load that file into BigQuery. There are other file formats that can be loaded into BigQuery from a local machine as well, which might interest you, but I personally would go with CSV.
I ran an experiment: I created an empty table in my dataset without adding any fields, saved a column of my random data in a CSV file, and then used the code mentioned in the first link to load it.
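For reference, here is a minimal sketch of that kind of load with the google-cloud-bigquery Python client; the file name and table ID are placeholders, so adjust them to your own project:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table ID: "your-project.your_dataset.your_table"
table_id = "my-project.my_dataset.random_data"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,  # let BigQuery infer the schema of the single column
)

# "random_data.csv" stands in for the locally saved one-column file
with open("random_data.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file, table_id, job_config=job_config
    )

load_job.result()  # wait for the load job to complete
print(f"Loaded {client.get_table(table_id).num_rows} rows.")
```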
If you encounter the following permissions error, see this solution, which uses an authentication key instead:
```
google.api_core.exceptions.Forbidden: 403 GET https://bigquery.googleapis.com/bigquery/v2/projects/project-name/jobs/job-id?location=EU: Request had insufficient authentication scopes.
```
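In practice, that kind of fix boils down to building the client from a service account key file instead of relying on the default credentials; a minimal sketch, assuming you have downloaded a JSON key (the path is a placeholder):

```python
from google.cloud import bigquery

# Placeholder path to the service account JSON key downloaded from the
# Cloud Console; this replaces the default credentials whose scopes
# caused the 403 above.
client = bigquery.Client.from_service_account_json("path/to/key.json")
```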
You might also find this link useful in case you get the following error:
```
google.api_core.exceptions.BadRequest: 400 Provided Schema does not match Table my-project:my_dataset.random_data. Cannot add fields (field: double_field_0)
```
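If the fix in that link does not apply to your case, one common workaround (my own suggestion, not necessarily what the link recommends) is to allow the load job to add the autodetected fields to the empty table's schema:

```python
from google.cloud import bigquery

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,
    # Allow the load job to add the autodetected fields
    # (e.g. double_field_0) to the empty table's schema.
    schema_update_options=[bigquery.SchemaUpdateOption.ALLOW_FIELD_ADDITION],
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
```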
Besides loading your data from a local file, you can upload your data file to Google Cloud Storage and load the data from there. Many file formats are supported, such as Avro, Parquet, ORC, CSV, and newline-delimited JSON.
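The GCS route looks almost identical in the Python client; here is a sketch with a placeholder bucket and table ID:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder URI and table ID; point these at your own bucket and dataset.
uri = "gs://my-bucket/random_data.csv"
table_id = "my-project.my_dataset.random_data"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to complete
```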
Finally, there is an option to stream data directly into a BigQuery table using the API, but it is not available on the free tier.
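For completeness, streaming with the Python client looks roughly like this (the table ID and field name are placeholders; the table must already exist, and streaming inserts are billed):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table ID; the table must already exist with a matching schema.
table_id = "my-project.my_dataset.random_data"

rows_to_insert = [
    {"double_field_0": 0.42},
    {"double_field_0": 3.14},
]

# insert_rows_json streams rows through the tabledata.insertAll API.
errors = client.insert_rows_json(table_id, rows_to_insert)
if errors:
    print(f"Encountered errors while inserting rows: {errors}")
```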