I'm creating a real-time Data Fusion pipeline where the sink is an HTTP plugin call to a Vertex AI endpoint in another GCP project. The request body is provided by a previous step in the pipeline. The HTTP sink plugin being used (HTTP v1.2.2) doesn't seem to support any OAuth parameters. What is the best way to make that HTTP call with a dynamically generated token in the headers? Any help is appreciated. Thank you.
1 Answer
As of now, there is no way to achieve this natively. I faced the same issue: my OAuth token expires after X days. To keep the pipeline from failing, I built a custom Argument Setter plugin and referenced the token macro it initializes in the HTTP plugin's configuration.
You can find the open-source code at https://github.com/data-integrations/argument-setter
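To illustrate the pattern, here is a minimal sketch of the JSON payload an Argument Setter endpoint could return (the `{"arguments": [...]}` shape is what the plugin reads; the `token` argument name is an illustrative choice, and the HTTP sink would then reference it as `${token}` in its header configuration):

```python
import json

def build_argument_setter_response(token: str) -> str:
    """Build the JSON payload an Argument Setter endpoint could return.

    The {"arguments": [{"name": ..., "value": ...}]} shape is what the
    Argument Setter plugin consumes; "token" is an illustrative argument
    name that the HTTP sink would reference as the macro ${token}.
    """
    payload = {"arguments": [{"name": "token", "value": token}]}
    return json.dumps(payload)

# The HTTP sink's header field would then be configured as, e.g.:
#   Authorization: Bearer ${token}
print(build_argument_setter_response("ya29.example-token"))
```

Because the Argument Setter runs once at pipeline start, the macro is resolved once per run rather than per record.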

Zahid Khan
- Thank you for the response. I extended Google's http-sink plugin to enable OAuth, so I'm unblocked with my custom plugin. – hansa29 Nov 15 '22 at 16:10
- The issue with that approach is that your custom HTTP sink gets an OAuth token for each record. Say you have 1,000 records: you are calling OAuth 1,000 times. – Zahid Khan Nov 16 '22 at 05:23
- The pipeline is a real-time pipeline with 1 ms expected latency, and each event in the stream (each Pub/Sub message) becomes an HTTP request to the endpoint. – hansa29 Nov 16 '22 at 10:42
- Interesting! It wasn't mentioned in the question. So you are dealing with a real-time pipeline and not batch? Are you currently able to process within 1 ms? And what value are you getting from Pub/Sub? – Zahid Khan Nov 16 '22 at 16:18
- I did mention it in the first line of the question ("I'm creating a **real-time** data fusion pipeline"). The new target latency is 5 ms, which is still to be reached; I'm playing a bit with resources. A JSON message body and attributes (which are HashMap-typed) are received from Pub/Sub. – hansa29 Nov 22 '22 at 13:06
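Regarding the per-record OAuth concern raised above, a custom sink can cache the token and refresh it only shortly before expiry, so 1,000 records cost one token fetch rather than 1,000. A minimal sketch (the `fetch` callable and all names here are hypothetical; in a real plugin it might wrap a google-auth credential refresh):

```python
import time

class CachedTokenProvider:
    """Fetch an OAuth token once and reuse it until shortly before expiry.

    `fetch` is any callable returning (token, expires_in_seconds) -- a
    hypothetical stand-in for whatever refreshes credentials in your sink.
    """
    def __init__(self, fetch, skew_seconds=60):
        self._fetch = fetch
        self._skew = skew_seconds  # refresh this many seconds early
        self._token = None
        self._expiry = 0.0

    def token(self):
        now = time.monotonic()
        if self._token is None or now >= self._expiry:
            tok, expires_in = self._fetch()
            self._token = tok
            self._expiry = now + expires_in - self._skew
        return self._token

# Demo: a counting fake fetcher shows the token is fetched once, not per record.
calls = {"n": 0}
def fake_fetch():
    calls["n"] += 1
    return (f"tok-{calls['n']}", 3600)

provider = CachedTokenProvider(fake_fetch)
for _ in range(1000):
    provider.token()
print(calls["n"])  # prints 1: one fetch serves all 1000 records
```

At a 1–5 ms latency target, removing the token round-trip from the per-record path is likely the single biggest win.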