Using the Cloud SDK, there is a command that gets you what you want and more:
bq ls --transfer_config --transfer_location=US --format=prettyjson
More about that here: List Scheduled Queries in BigQuery
Executing this within your command prompt (given the Google Cloud SDK is installed) returns the full JSON configuration of each scheduled query, including the query SQL itself:

Following that, you can run this as a shell subprocess within Python and parse it out:
import json
from subprocess import PIPE, run

import pandas as pd

response = run('bq ls --transfer_config --transfer_location=US --format=prettyjson',
               stdout=PIPE,
               stderr=PIPE,
               universal_newlines=True,
               shell=True)
response
Here are the first few lines of the result:
CompletedProcess(args='bq ls --transfer_config --transfer_location=US --format=prettyjson', returncode=0, stdout='[\n {\n "dataSourceId": "scheduled_query",\...
Then, to get to the SQL, access the output via response.stdout, parse it as JSON, and work your way through the dictionaries to the desired results, or load it into a pandas DataFrame and go from there, like below:
data = json.loads(response.stdout)
df = pd.json_normalize(data)
df.columns then shows the available fields:

dataSourceId
datasetRegion
destinationDatasetId
disabled
displayName
name
schedule
state
updateTime
userId
emailPreferences.enableFailureEmail
params.destination_table_name_template
params.query          ### the SQL is located in this one
params.write_disposition
scheduleOptions.startTime
params.overwrite_destination_table
params.source_dataset_id
params.source_project_id
scheduleOptions.endTime
nextRunTime
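Putting it together, here is a minimal sketch of pulling the SQL out of the parsed output. The sample record below is made up for illustration (the field names match what bq returns, but the values are invented), standing in for response.stdout:

```python
import json

# Illustrative stand-in for response.stdout from the bq command above
# (abbreviated; values are made up, field names match the real output).
sample_stdout = json.dumps([
    {
        "dataSourceId": "scheduled_query",
        "displayName": "daily_sales_rollup",
        "schedule": "every 24 hours",
        "params": {
            "query": "SELECT date, SUM(amount) AS total FROM sales GROUP BY date",
            "write_disposition": "WRITE_TRUNCATE",
        },
    }
])

data = json.loads(sample_stdout)

# Map each scheduled query's display name to its SQL text.
queries = {cfg["displayName"]: cfg["params"]["query"] for cfg in data}
print(queries["daily_sales_rollup"])
```

With the real stdout you would swap sample_stdout for response.stdout; the same dict comprehension then gives you every scheduled query's SQL keyed by its display name.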