Is it possible to create a two-container KubeFlow pipeline whose steps would be as follows?
Container #1: runs and creates an output, a 'text' argument, and stores it as an output .txt file. Example component spec (the `outputs` section and `outputPath` placeholder show where the text output would be declared):

```yaml
name: some-pipeline
description: Testing if my pipeline works
outputs:
  - {name: text, type: String}
implementation:
  container:
    image: url-to-some-image:latest_tag
    command: [python3, test-file.py, --output-path, {outputPath: text}]
```
Container #2: created via `load_component_from_text(text=...)`, where the `text` parameter takes container #1's output as its argument.
The pipeline shall run on a recurring schedule. On each run, container #1 shall check the latest tag of the image and pass it as the `text` argument to container #2. Is it possible to do that with the kfp/dsl Python libraries, or what is the correct way to solve such a problem?