
I've started developing my first Dataflow job using Scio, the Scala SDK for Apache Beam. The job will run in streaming mode.

Can anyone advise on the best way to deploy this? I've read in the Scio docs that they use sbt-pack and then deploy the packaged job within a Docker container. I have also read about using Dataflow templates (but not in great detail).

What's best?

dhaken

1 Answer


As with the Java and Python SDKs, you can run your code directly on Dataflow by using the Dataflow runner and launching it from your computer (or from a VM or Cloud Function).
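
As an illustration, here is a minimal sketch of a Scio streaming job submitted straight from your machine. All project, bucket and topic names are placeholders, and the Pub/Sub helpers shown (pubsubTopic / saveAsPubsub) match the Scio API around the time of this question (pre-0.8); they may be deprecated or renamed in newer releases.

    package example

    import com.spotify.scio._

    // Minimal Scio streaming job (sketch); resource names are placeholders.
    object MyStreamingJob {
      def main(cmdlineArgs: Array[String]): Unit = {
        // Parses --key=value arguments; Beam/Dataflow options
        // (--runner, --project, --region, ...) are picked up automatically.
        val (sc, args) = ContextAndArgs(cmdlineArgs)

        sc.pubsubTopic[String](args("inputTopic"))   // e.g. projects/my-project/topics/in
          .map(_.toUpperCase)
          .saveAsPubsub(args("outputTopic"))         // e.g. projects/my-project/topics/out

        sc.run()   // sc.close() on older Scio releases
      }
    }

Launched with something like the following, the local process only builds and uploads the pipeline; the job itself runs on the Dataflow service:

    sbt "runMain example.MyStreamingJob \
      --runner=DataflowRunner --project=my-project --region=europe-west1 \
      --streaming=true --tempLocation=gs://my-bucket/temp \
      --inputTopic=projects/my-project/topics/in \
      --outputTopic=projects/my-project/topics/out"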

If you want to package it for reuse, you can create a Dataflow template.
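
For instance (a sketch; the bucket, project and job names are placeholders, and example.MyStreamingJob refers to the hypothetical job above), a classic template is created by running the same main class once with --templateLocation, which makes the Dataflow runner stage the pipeline graph instead of starting a job. A job is then launched from that file with gcloud:

    # Stage the template (no job is started; the pipeline graph is written to GCS)
    sbt "runMain example.MyStreamingJob \
      --runner=DataflowRunner --project=my-project --region=europe-west1 \
      --streaming=true --stagingLocation=gs://my-bucket/staging \
      --templateLocation=gs://my-bucket/templates/my-streaming-job \
      --inputTopic=projects/my-project/topics/in \
      --outputTopic=projects/my-project/topics/out"

    # Launch a job from the staged template (e.g. from Jenkins); this call returns
    # once the job is created rather than blocking until the job finishes
    gcloud dataflow jobs run my-streaming-job-1 \
      --gcs-location=gs://my-bucket/templates/my-streaming-job \
      --region=europe-west1

Note that a classic template freezes the pipeline graph at staging time, so anything read eagerly from the Scio Args (such as the topics above) is baked in; values you want to vary per launch have to be wired in as Beam runtime parameters (ValueProvider) instead.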

You can't run custom containers on Dataflow.

guillaume blaquiere
  • How do you launch it without the client being locked? For a streaming job I want Jenkins to launch the job, wait for it to start, and then have the pipeline go green. – dhaken Oct 22 '19 at 11:34