
I have two Python files, testing_1.py and testing_2.py. I created two ConfigMaps: testing-config1 to store testing_1.py, and testing-config2 to store testing_2.py.
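
A minimal sketch of what one of these ConfigMaps might look like (the script body here is only a placeholder):

apiVersion: v1
kind: ConfigMap
metadata:
  name: testing-config1
data:
  testing_1.py: |
    # placeholder for the real contents of testing_1.py
    print("hello from testing_1.py")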

In the Kubernetes YAML:

...
      containers:
      - name: spark-master
        image: bde2020/spark-master:2.4.4-hadoop2.7
        volumeMounts:
        - name: testing
          mountPath: /jobs
        - name: testing2
          mountPath: /jobs
      volumes:
        - name: testing
          configMap:
            name: testing-config1
        - name: testing2
          configMap:
            name: testing-config2
...

In the resulting container, /jobs contains only testing_1.py.

Hong

2 Answers


You can do it by providing a subPath for each volume mount. Change the spec as below:

      containers:
      - name: spark-master
        image: bde2020/spark-master:2.4.4-hadoop2.7
        volumeMounts:
        - name: testing
          mountPath: /jobs/testing_1.py
          subPath: testing_1.py
        - name: testing2
          mountPath: /jobs/testing_2.py
          subPath: testing_2.py
      volumes:
        - name: testing
          configMap:
            name: testing-config1
        - name: testing2
          configMap:
            name: testing-config2
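
This works because a ConfigMap volume mounted at a directory replaces that directory's contents, so two volumes mounted at the same /jobs path collide and only one ends up visible. With subPath, each file is mounted individually into /jobs, so both files can coexist.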
Anmol Agrawal

You can put both files in the same ConfigMap.
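
A sketch of that approach, assuming a single ConfigMap named testing-config (a hypothetical name) holding both scripts, with placeholder file contents:

apiVersion: v1
kind: ConfigMap
metadata:
  name: testing-config
data:
  testing_1.py: |
    # placeholder for the real contents of testing_1.py
  testing_2.py: |
    # placeholder for the real contents of testing_2.py

Then only one volume and one mount at /jobs are needed:

        volumeMounts:
        - name: testing
          mountPath: /jobs
      volumes:
        - name: testing
          configMap:
            name: testing-config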

Or, use a projected volume, as sketched after the quoted list below:

A projected volume maps several existing volume sources into the same directory.

Currently, the following types of volume sources can be projected:

  • secret
  • downwardAPI
  • configMap
  • serviceAccountToken
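
A minimal sketch of the projected-volume approach, keeping the two existing ConfigMaps and merging them under /jobs (the volume name jobs is a hypothetical choice):

      containers:
      - name: spark-master
        image: bde2020/spark-master:2.4.4-hadoop2.7
        volumeMounts:
        - name: jobs
          mountPath: /jobs
      volumes:
        - name: jobs
          projected:
            sources:
            - configMap:
                name: testing-config1
            - configMap:
                name: testing-config2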
gears