I am currently trying to read the messages as bytes (ByteDeserializer), similar to the KafkaIO example. My test setup is as follows:

Option 1: Configure the pipeline with --runner=PortableRunner

Option 2: Start the local Flink job server:

docker run --net=host apache/beam_flink1.10_job_server:latest

Publish test Kafka Avro messages.
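For this step, a minimal sketch of building the byte payloads that ReadFromKafka would receive. JSON is used here as a simple stand-in for Avro serialization, and the commented-out producer call via kafka-python is a hypothetical illustration, not part of the original setup:

```python
import json

def encode_test_message(record):
    """Serialize a test record to bytes.

    JSON stands in for Avro here; a real Avro producer would
    serialize against a registered schema instead.
    """
    return json.dumps(record).encode("utf-8")

# Byte payloads as the Kafka consumer's byte deserializer would see them.
payloads = [encode_test_message({"id": i, "name": "user-%d" % i})
            for i in range(3)]

# With a broker running on localhost:9092, these bytes could be
# published with kafka-python (hypothetical, assumes the library):
#
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   for p in payloads:
#       producer.send("beam-test-topic", value=p)
#   producer.flush()
```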

Pipeline args are defined as:

pipeline_args = ['--runner', 'FlinkRunner',
                 '--job_endpoint', 'localhost:8099',
                 '--environment_type', 'LOOPBACK',
                 '--flink_version', '1.10',
                 '--flink_master', 'localhost:8081']
pipeline_options = PipelineOptions(pipeline_args, save_main_session=True, streaming=True)

Pipeline setup:

_ = (pipeline | ReadFromKafka(
                consumer_config={'bootstrap.servers': 'localhost:9092'},
                topics=['beam-test-topic'])
              | beam.FlatMap(lambda kv: log_topic_contents(kv[1])))
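log_topic_contents is not defined in the question; a minimal sketch of such a helper, assuming the Kafka message values arrive as raw UTF-8 bytes:

```python
import logging

def log_topic_contents(value):
    """Decode the raw Kafka message bytes and log them.

    ReadFromKafka with a byte deserializer hands each element to the
    pipeline as a (key, value) pair of bytes; this helper decodes the
    value, logs it, and yields it so FlatMap can emit it downstream.
    """
    decoded = value.decode("utf-8")
    logging.info("Received message: %s", decoded)
    yield decoded
```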

When I execute the pipeline, the default expansion service SDK image (apache/beam_python3.7_sdk:2.29.0) is used and the job is submitted to the Flink job server. The job server fails with "Failed to submit JobGraph" and "Rest endpoint shutdown".

Am I missing any runtime configuration for the pipeline?

Vim

1 Answer

Cross-language transforms are currently not supported for the 'LOOPBACK' environment type. Can you retry with the 'DOCKER' type?
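For reference, a sketch of the question's pipeline arguments with the environment type switched from LOOPBACK to DOCKER (all other values unchanged):

```python
pipeline_args = ['--runner', 'FlinkRunner',
                 '--job_endpoint', 'localhost:8099',
                 # DOCKER instead of LOOPBACK: cross-language transforms
                 # such as ReadFromKafka are not supported with LOOPBACK
                 '--environment_type', 'DOCKER',
                 '--flink_version', '1.10',
                 '--flink_master', 'localhost:8081']
```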

chamikara