I am trying to run probably the most basic Kubeflow pipeline, as described at this link: https://www.kubeflow.org/docs/components/pipelines/sdk/python-function-components/
The pipeline just calls an "add" function twice inside a DSL pipeline. Here is the code:
import kfp
from kfp import dsl
from kfp.components import create_component_from_func
from datetime import datetime

def add(a: float, b: float) -> float:
    print('Adding 2 numbers: ' + str(a) + ' and ' + str(b))
    return a + b

add_op = create_component_from_func(
    add, output_component_file='add_component.yaml')

@dsl.pipeline(name='Addition pipeline',
              description='An example pipeline that performs addition calculations.')
def add_pipeline(a='1', b='7'):
    first_add_task = add_op(a, 4)
    # The second task consumes the first task's output.
    second_add_task = add_op(first_add_task.output, b)

client = kfp.Client()  # connects to the in-cluster KFP endpoint
exp_name = 'addition-pipeline'  # experiment name (defined elsewhere in my notebook)
arguments = {'a': '7', 'b': '8'}
client.create_run_from_pipeline_func(add_pipeline,
                                     arguments=arguments,
                                     run_name=exp_name + '-' + str(datetime.now()),
                                     experiment_name=exp_name)
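For what it's worth, the component logic itself is not the problem; calling the plain Python function directly (outside of KFP) with the same values the pipeline would use works fine:

```python
# Quick local sanity check of the component logic, outside of KFP.
def add(a: float, b: float) -> float:
    print('Adding 2 numbers: ' + str(a) + ' and ' + str(b))
    return a + b

first = add(7.0, 4.0)    # mirrors first_add_task with a='7'
second = add(first, 8.0)  # mirrors second_add_task with b='8'
print(second)             # prints 19.0
```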
The pipeline is created and even prints the variables passed to the first component, but the run status stays "Running" and the second component never starts.
Here is the screenshot:
Can anyone please suggest how to move forward here? Note that I am able to successfully run single-component pipelines on this KFP instance; only multi-component pipelines get stuck.
This is the error in the wait container:
time="2022-09-06T06:24:49.366Z" level=info msg="Copying /tmp/outputs/Output/data from container base image layer to /tmp/argo/outputs/artifacts/add-Output.tgz"
time="2022-09-06T06:24:49.390Z" level=info msg="/var/run/argo/outputs/artifacts/tmp/outputs/Output/data.tgz -> /tmp/argo/outputs/artifacts/add-Output.tgz"
time="2022-09-06T06:24:49.390Z" level=error msg="executor error: You need to configure artifact storage. More information on how to do this can be found in the docs: https://argoproj.github.io/argo-workflows/configure-artifact-repository/"
time="2022-09-06T06:24:49.530Z" level=info msg="Create workflowtaskresults 403"
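If I read the executor error correctly, the Argo workflow controller has no artifact repository configured, which it would need to pass the first task's output to the second task (single-component runs succeed because nothing has to be handed between steps). If that is the cause, the fix would presumably be an artifactRepository entry in the workflow-controller-configmap. Below is only a sketch based on the MinIO defaults a standard KFP install uses; the endpoint, bucket, and secret names are assumptions to adapt to the actual deployment:

```yaml
# Sketch of an artifactRepository entry for the Argo workflow controller.
# minio-service.kubeflow:9000, mlpipeline, and mlpipeline-minio-artifact
# are the usual KFP defaults, but are assumptions here -- adapt as needed.
apiVersion: v1
kind: ConfigMap
metadata:
  name: workflow-controller-configmap
  namespace: kubeflow
data:
  artifactRepository: |
    s3:
      endpoint: minio-service.kubeflow:9000
      bucket: mlpipeline
      insecure: true
      accessKeySecret:
        name: mlpipeline-minio-artifact
        key: accesskey
      secretKeySecret:
        name: mlpipeline-minio-artifact
        key: secretkey
```

The trailing "Create workflowtaskresults 403" line may also point at a separate RBAC problem (the wait container's service account not being allowed to create workflowtaskresults), but I am not sure whether it is a consequence of the artifact error or an independent issue.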