I have a task to implement a backend service (Java/Spring) running on Cloud Run and to configure a data pipeline in GCP that uses this service to automatically process Avro files (with embedded schema) that are uploaded to a Cloud Storage bucket. When a new file lands in the bucket, I need to process it and load it into BigQuery in a specific way.
So far, I have successfully deployed the Spring application and designed the Avro schema. I also found that Google has an example of loading Avro files into BigQuery, and I think it can be applied to this task.
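For context, here is a minimal sketch of how I imagine calling that from my service, based on the BigQuery Java client library (the dataset and table names are just placeholders; the source URI would point at the uploaded file):

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.LoadJobConfiguration;
import com.google.cloud.bigquery.TableId;

public class AvroLoader {

    // Starts a BigQuery load job for an Avro file sitting in Cloud Storage.
    // Because the Avro file carries an embedded schema, no explicit schema is passed.
    public static void loadAvroFromGcs(String dataset, String table, String sourceUri)
            throws InterruptedException {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        LoadJobConfiguration loadConfig =
                LoadJobConfiguration.newBuilder(TableId.of(dataset, table), sourceUri)
                        .setFormatOptions(FormatOptions.avro())
                        .build();

        // Create the load job and wait for it to finish.
        Job job = bigquery.create(JobInfo.of(loadConfig));
        job = job.waitFor();

        if (job.isDone() && job.getStatus().getError() == null) {
            System.out.println("Loaded " + sourceUri + " into " + dataset + "." + table);
        } else {
            System.out.println("Load job failed: " + job.getStatus().getError());
        }
    }
}
```

My understanding is that, since the schema is embedded in the Avro file, I don't need to supply one to the load job.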
I'm stuck on the upload event (or maybe on the data pipeline configuration?). I really don't know how to handle file upload events (I suppose I need to get the URI of the new file when it is uploaded). I tried reading about Google Dataflow, but I don't think it is what I need for this task. Could you please give me some advice on how to do this?
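For reference, this is roughly how I imagine the entry point in my Cloud Run service could look, assuming the upload event somehow reaches it as an HTTP push (I read that Cloud Storage can publish notifications to Pub/Sub and that Pub/Sub can push to a Cloud Run endpoint, but I'm not sure whether that, Eventarc, or something else is the intended approach). The `/pubsub` path, the attribute handling, and the dataset/table names are just my guesses:

```java
import com.fasterxml.jackson.databind.JsonNode;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UploadEventController {

    // Endpoint that a Pub/Sub push subscription (or similar) would call when
    // a new object is created in the bucket.
    @PostMapping("/pubsub")
    public ResponseEntity<Void> handleUpload(@RequestBody JsonNode pushRequest)
            throws InterruptedException {
        // Pub/Sub push wraps the notification in a "message" envelope; Cloud Storage
        // notifications put the bucket and object name into the message attributes.
        JsonNode attributes = pushRequest.path("message").path("attributes");
        String bucket = attributes.path("bucketId").asText();
        String object = attributes.path("objectId").asText();
        String eventType = attributes.path("eventType").asText();

        // Only react to newly created objects; acknowledge everything else
        // so it is not redelivered.
        if (!"OBJECT_FINALIZE".equals(eventType)) {
            return ResponseEntity.noContent().build();
        }

        // The URI of the newly uploaded file.
        String sourceUri = "gs://" + bucket + "/" + object;

        // Hand the URI to the load logic (e.g. the AvroLoader sketch above).
        AvroLoader.loadAvroFromGcs("my_dataset", "my_table", sourceUri);

        // Any 2xx response acknowledges the message.
        return ResponseEntity.noContent().build();
    }
}
```

Is this the right direction, or is there a more standard way to wire the bucket to the service?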