I would like to create a BigQuery UDF to transport the data from BigQuery to Memorystore. To do this I think I should follow three steps:
def get_column_from_table(key): .... here I should fetch the row for my main key from BigQuery.
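For step 1, a minimal sketch of get_column_from_table using the google-cloud-bigquery client could look like the following; the my_project.my_dataset.my_table and key_column names are placeholders you would replace with your own:

from google.cloud import bigquery

bq_client = bigquery.Client()

def get_column_from_table(key):
    # Fetch the single row whose key column matches the given key.
    query = """
        SELECT *
        FROM `my_project.my_dataset.my_table`
        WHERE key_column = @key
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("key", "STRING", key)]
    )
    rows = bq_client.query(query, job_config=job_config).result()
    row = next(iter(rows), None)
    # Return the row as a plain dict, or None if the key was not found.
    return dict(row) if row else None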
def get_field_value(field, key): if the third element is an array, it should be broken into as many elements as the array contains, and the same number of entries should be added to Memorystore. Here I should reverse the process (a sketch of that array handling follows the code below):
def get_field_value(field, key):
    element = get_column_from_table(key)
    keys = list(element.keys())
    # Unique prefixes before the ":" separator, e.g. "family:column" -> "family".
    uniq_keys = list(dict.fromkeys([x.partition(":")[0] for x in keys]))
    result = {}
    for uniq in uniq_keys:
        value = None
        for k in keys:
            if k.partition(":")[0] == uniq:
                if value is not None:
                    value = value + "\n" + element[k]
                else:
                    value = element[k]
        result[uniq] = value
    if field in result:
        return result[field]
    else:
        return ""
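For the array case described above, one way to reverse the process and break an ARRAY column into several Memorystore entries is sketched below; build_hash_fields is a hypothetical helper name, and the field:0, field:1 naming is only one possible convention:

def build_hash_fields(row):
    # Turn one BigQuery row (a dict) into a flat mapping of hash fields,
    # expanding ARRAY (REPEATED) columns into one entry per element and
    # skipping null columns entirely.
    mapping = {}
    for field, value in row.items():
        if value is None:
            continue
        if isinstance(value, (list, tuple)):
            # e.g. tags = ["a", "b"] becomes tags:0 = "a" and tags:1 = "b"
            for i, item in enumerate(value):
                if item is not None:
                    mapping[f"{field}:{i}"] = str(item)
        else:
            mapping[field] = str(value)
    return mapping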
def post():
    ....
    x = hset(...)
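Here is a rough sketch of the post step, assuming Memorystore for Redis reached through the redis-py client; the host/port values come from environment variables, and the hash is keyed by the BigQuery key itself (all of this is an assumption, not a fixed design):

import os
import redis

# Memorystore (Redis) connection; the host/port defaults are placeholders.
redis_client = redis.Redis(
    host=os.environ.get("REDIS_HOST", "10.0.0.3"),
    port=int(os.environ.get("REDIS_PORT", "6379")),
)

def post(key):
    # Fetch the row for this key and write it to a Redis hash.
    row = get_column_from_table(key)
    if not row:
        return 0
    mapping = build_hash_fields(row)
    # hset with mapping= writes all hash fields in a single call.
    return redis_client.hset(key, mapping=mapping)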
It is not very clear to me how I should take the first step. Also, I intend the query in the Cloud Function to be of the following type: select .....
(key, field)
If I only enter the key, it should retrieve all my data (and at the same time the columns in the table that are null should be ignored), or I should be able to request a certain column as well. Any suggestion/documentation/example, please?
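One possible shape for the Cloud Function entry point that ties these pieces together; this assumes an HTTP-triggered function where key and field arrive as request parameters, and it reuses the redis_client and build_hash_fields sketched above (all names are placeholders):

import json

def bigquery_to_memorystore(request):
    key = request.args.get("key")
    field = request.args.get("field")
    if not key:
        return ("missing key", 400)

    row = get_column_from_table(key)
    if not row:
        return ("not found", 404)

    if field:
        # A specific column was requested; ignore it if it is null.
        value = row.get(field)
        if value is None:
            return ("", 204)
        redis_client.hset(key, field, str(value))
        return json.dumps({field: str(value)})

    # Only the key was given: store every non-null column (arrays expanded).
    mapping = build_hash_fields(row)
    redis_client.hset(key, mapping=mapping)
    return json.dumps(mapping)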