
I'm trying to use the stdout from one step as a file input for another step.

Since the output is pretty big, I'm getting the error `argument list too long`.

...
spec:
  templates:
  - name: main
    steps:
    - - name: big-output
...
    - - name: print
        template: head-query
        arguments:
          parameters:
          - name: query-result
            raw:
              data: "{{steps.big-output.outputs.result}}"
  - name: head-query
    inputs:
      parameters:
      - name: query-result
        path: /input/query.txt
        raw:
          data: "{{inputs.parameters.query-result}}"
    container:
      image: alpine
      command: [head]
      args:
      - /input/query.txt

What is the proper way to put the stdout into a file? Is there some way to avoid modifying the step that produces the big output?

Federico Nafria

1 Answer


Your approach should work, as long as the output doesn't exceed the 256 kB limit for output parameters.

The Workflow as written is invalid, because `raw` is meant to be used with artifacts rather than parameters.

If you were to run argo lint you would get an error like this:

✖ in "big-parameter-" (Workflow): json: unknown field "path"
✖ 1 linting errors found!

Modifying the Workflow manifest to use artifacts instead of parameters should allow it to work.

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: big-parameter-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: big-output
            template: big-output
        - - name: print
            template: head-query
            arguments:
              artifacts:
                - name: query-result
                  raw:
                    data: "{{steps.big-output.outputs.result}}"
    - name: big-output
      script:
        image: alpine
        command:
          - sh
        source: |
          echo "pretend this is really big"
    - name: head-query
      inputs:
        artifacts:
          - name: query-result
            path: /input/query.txt
      container:
        image: alpine
        command: [head]
        args:
          - /input/query.txt
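If the raw-artifact approach still embeds the data in the manifest, a sketch of an alternative is to write the stdout to a file in the producing step, declare it as an output artifact, and pass it to the consumer by reference with `from`. This keeps the data out of parameters entirely, but it assumes an artifact repository (e.g. S3 or MinIO) is configured for the cluster; the template names and paths here are illustrative.

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: big-artifact-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: big-output
            template: big-output
        - - name: print
            template: head-query
            arguments:
              artifacts:
                # pass the file by reference, not by value
                - name: query-result
                  from: "{{steps.big-output.outputs.artifacts.result}}"
    - name: big-output
      script:
        image: alpine
        command: [sh]
        source: |
          # write the big output to a file instead of relying on stdout
          echo "pretend this is really big" > /tmp/result.txt
      outputs:
        artifacts:
          - name: result
            path: /tmp/result.txt
    - name: head-query
      inputs:
        artifacts:
          - name: query-result
            path: /input/query.txt
      container:
        image: alpine
        command: [head]
        args:
          - /input/query.txt
```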
crenshaw-dev
  • I'm still getting the `argument list too long` error. After checking the generated yaml in the cluster, I can see that it's adding the template as an environment variable `ARGO_TEMPLATE` with the value of the parameter in it; making it huge. I'm guessing there is no way to treat the output purely as a file. – Federico Nafria Feb 17 '22 at 08:49