I have an Argo workflow that loops over a JSON array. When the list gets too large, I get an error like this:
time="some-time" level=fatal msg="Pod \"some-pod-name\" is invalid: metadata.annotations: Too long: must have at most 262144 characters"
Or, in newer versions of Argo:
Output is larger than the maximum allowed size of 256 kB, only the last 256 kB were saved
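For scale, the array doesn't have to be enormous to cross the 262144-character limit. This sketch (the 10,000-item array is a hypothetical stand-in for my real data, assuming Python 3 is available) shows roughly where it breaks:

```python
import json
import uuid

# Build a JSON array similar to what a get-items step would emit;
# 10,000 UUID-length strings is a hypothetical stand-in for real items.
items = [str(uuid.uuid4()) for _ in range(10_000)]
payload = json.dumps(items)

# Each UUID is 36 characters plus quotes and separators, so the
# serialized array comfortably exceeds the 262144-character limit.
print(len(payload))
print(len(payload) > 262144)
```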
How can I loop over this large JSON array without hitting the size limit?
My workflow looks something like this, only with a much larger JSON array:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: loops-sequence-
spec:
  entrypoint: loops-sequence
  templates:
  - name: loops-sequence
    steps:
    - - name: get-items
        template: get-items
    - - name: sequence-param
        template: echo
        arguments:
          parameters:
          - name: item
            value: "{{item}}"
        withParam: "{{steps.get-items.outputs.parameters.items}}"
  - name: get-items
    container:
      image: alpine:latest
      command: ["/bin/sh", "-c"]
      args: ["echo '[\"a\", \"b\", \"c\"]' > /tmp/items"]
    outputs:
      parameters:
      - name: items
        valueFrom:
          path: /tmp/items
  - name: echo
    inputs:
      parameters:
      - name: item
    container:
      image: stedolan/jq:latest
      command: [echo, "{{inputs.parameters.item}}"]