
I am aware that, using Kafka Connect, you can create a connector to sink all Kafka messages to a Pub/Sub topic.

I would like to know if I can use Spring Cloud Stream binders in combination with Spring Cloud Function and deploy all of this to Google Cloud Functions.

If I understand correctly, I can combine Spring Cloud Stream and Spring Cloud Function. Does this mean that I can use the binder from Spring Cloud Stream to actually accept a message from Kafka?
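For context, the kind of function I have in mind is binder-agnostic plain Java. A minimal sketch (class and method names are my own, not from any Spring sample) — in Spring Cloud Function this would be registered as a `@Bean`, and the Kafka wiring would live entirely in Spring Cloud Stream configuration, not in the code:

```java
import java.util.function.Function;

// Sketch of a function body that could be exposed as a Spring bean
// (e.g. @Bean Function<String, String> uppercase()). The binder decides
// where the input comes from (Kafka, Pub/Sub, ...), so the code itself
// carries no messaging dependency.
public class UppercaseFunction {

    public static Function<String, String> uppercase() {
        return payload -> payload.toUpperCase();
    }

    public static void main(String[] args) {
        // Exercise the function locally with a sample payload.
        System.out.println(uppercase().apply("hello kafka"));
    }
}
```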

Alexander Petrov
  • About deploying this to `Google Cloud Functions`: yes, it is possible, as mentioned [here](https://cloud.google.com/blog/products/application-development/introducing-java-11-on-google-cloud-functions), using the Java 11 runtime for GCF – Soni Sol Sep 08 '20 at 18:26
  • @JoséSoní I don't think you understood the question. The question is not whether I can deploy spring-cloud-function to GCF — that much is clear. The question is whether I can create a Kafka binder via spring-cloud-stream together with spring-cloud-function and deploy it to GCF. The idea is that I then possibly don't need the Pub/Sub trigger, and my Kafka is talking directly to my function. – Alexander Petrov Sep 08 '20 at 18:46

1 Answer


Google Cloud Functions uses a trigger model to fire the functions built with spring-cloud-function, and Kafka is not a supported trigger.

The workaround is to use Kafka Connect to feed Pub/Sub from a Kafka topic. You can either write your own connector: https://dev.to/vtatai/make-kafka-serverless-by-connecting-it-to-google-cloud-functions-2ahh

Or use the one supplied: https://github.com/GoogleCloudPlatform/pubsub/tree/master/kafka-connector
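With the supplied connector, the Kafka Connect side is just a properties file. A minimal sketch (the connector name, topics, project, and Pub/Sub topic here are placeholders — check the repository README for the exact property set your version supports):

```properties
# Hypothetical standalone Kafka Connect config for the Pub/Sub sink connector.
name=kafka-to-pubsub-sink
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=1
# Kafka topic(s) to drain:
topics=my-kafka-topic
# Target GCP project and Pub/Sub topic (placeholders):
cps.project=my-gcp-project
cps.topic=my-pubsub-topic
```

Your Pub/Sub-triggered function on GCF then fires for every message the connector forwards.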

Alexander Petrov