
I am trying to run a shell script at a regular interval of 1 minute using a CronJob.

I have created the following CronJob in my OpenShift template:

- kind: CronJob
  apiVersion: batch/v2alpha1
  metadata:
    name: "${APPLICATION_NAME}"
  spec:
    schedule: "*/1 * * * *"
    jobTemplate:
      spec:
        template:
          spec:
            containers:
            - name: mycron-container
              image: alpine:3
              imagePullPolicy: IfNotPresent

              command: [ "/bin/sh" ]
              args: [ "/var/httpd-init/croyscript.sh" ]
              volumeMounts:
              - name: script
                mountPath: "/var/httpd-init/"
            volumes:
            - name: script
              configMap:
                name: ${APPLICATION_NAME}-croyscript
            restartPolicy: OnFailure
            terminationGracePeriodSeconds: 0

    concurrencyPolicy: Replace

The following is the ConfigMap mounted as a volume in this job:

- kind: ConfigMap
  apiVersion: v1
  metadata:
    name: ${APPLICATION_NAME}-croyscript
    labels:
      app: "${APPLICATION_NAME}"
  data:
    croyscript.sh: |
      #!/bin/sh
      if [ "${APPLICATION_PATH}" != "" ]; then
          mkdir -p /var/httpd-resources/${APPLICATION_PATH}
      fi
      mkdir temp
      cd temp 
      ###### SOME CODE ######

This CronJob is running: I can see the name of the job getting replaced every minute (as scheduled). But it is not executing the shell script croyscript.sh. Am I doing anything wrong here? (Maybe I have mounted the ConfigMap in the wrong way, so the Job is not able to access the shell script.)

Harshit Goel
  • Are you sure it is not run at all? Or can you just not see the directory it is supposed to create? Is there anything in the events when you run `kubectl describe job`? Try adding some logging to your script, like echoing a message, to make sure it is not run at all. – mario Nov 05 '19 at 16:20
  • @mario It is not able to create the pod at all. The CronJob is running, as I can see when I query `oc get jobs`: NAME: web-croy-1573015980, DESIRED: 1, SUCCESSFUL: 0, AGE: 19s – Harshit Goel Nov 06 '19 at 04:55
  • Running `kubectl logs web-croy-1573017240-ng496`, the output is: Error from server: container "mycron-container" in pod "web-croy-1573017240-ng496" is waiting to start: ContainerCreating. Could this be a problem with the Kubernetes version? My server version is 1.7. – Harshit Goel Nov 06 '19 at 05:15
  • Version 1.7 is already quite obsolete and isn't supported any more, so I would definitely recommend upgrading to a newer version; the current version of Kubernetes is 1.16. If the problem persists you can continue debugging, but upgrading your server is a good starting point. Do you see any additional info when you run `kubectl describe job` / `kubectl describe cronjob`? – mario Nov 06 '19 at 12:13

1 Answer


Try the approach below.

Update the permissions on the mounted ConfigMap by setting `defaultMode`:

            volumes:
            - name: script
              configMap:
                name: ${APPLICATION_NAME}-croyscript
                defaultMode: 0777

If this doesn't work, the most likely cause is that the script in the mounted volume is read-only (ConfigMap volumes are mounted read-only). Use an initContainer to copy the script to a different, writable location, set the appropriate permissions there, and point the container's command/args at that location, as sketched below.
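Here is a minimal sketch of that initContainer approach, reusing the ConfigMap volume from the question; the emptyDir volume name `script-copy` and the mount path `/opt/script/` are illustrative, not part of the original template:

    jobTemplate:
      spec:
        template:
          spec:
            initContainers:
            # copy the script out of the read-only ConfigMap volume into a
            # writable emptyDir and make it executable
            - name: copy-script
              image: alpine:3
              command: [ "/bin/sh", "-c" ]
              args: [ "cp /var/httpd-init/croyscript.sh /opt/script/ && chmod 755 /opt/script/croyscript.sh" ]
              volumeMounts:
              - name: script
                mountPath: "/var/httpd-init/"
              - name: script-copy
                mountPath: "/opt/script/"
            containers:
            - name: mycron-container
              image: alpine:3
              imagePullPolicy: IfNotPresent
              command: [ "/bin/sh" ]
              args: [ "/opt/script/croyscript.sh" ]
              volumeMounts:
              - name: script-copy
                mountPath: "/opt/script/"
            volumes:
            - name: script
              configMap:
                name: ${APPLICATION_NAME}-croyscript
            - name: script-copy
              emptyDir: {}
            restartPolicy: OnFailure

Since the container invokes the script via `/bin/sh`, the execute bit is not strictly required; the main point of the copy is that the job runs a script from a plain writable directory instead of the read-only ConfigMap mount.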

P Ekambaram