
I am trying to copy a CA certificate stored on my Windows machine into a Docker container, but the image build fails.

I do not understand how to copy the certificate from its Windows location to the recommended Linux location.

FROM amazonlinux:2
FROM amazoncorretto:8
FROM maven:3.6-amazoncorretto-8

RUN yum install -y procps
WORKDIR /tmp/
ADD pom.xml /tmp
RUN cp ~/Users/<username>/<bundle>.pem  /etc/pki/ca-trust/source/anchors/
RUN update-ca-trust
#RUN curl -O   ./spark-3.1.1-bin-without-hadoop.tgz  https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-without-hadoop.tgz
RUN curl    -O  ./spark-3.1.1-bin-without-hadoop.tgz  -I https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-without-hadoop.tgz
RUN tar -xzf spark-3.1.1-bin-without-hadoop.tgz && \
    mv spark-3.1.1-bin-without-hadoop /opt/spark && \
    rm spark-3.1.1-bin-without-hadoop.tgz
RUN mvn dependency:copy-dependencies -DoutputDirectory=/opt/spark/jars/
RUN rm /opt/spark/jars/jsr305-3.0.0.jar && \
    rm /opt/spark/jars/jersey-*-1.19.jar && \
    rm /opt/spark/jars/jackson-dataformat-cbor-2.6.7.jar && \
    rm /opt/spark/jars/joda-time-2.8.1.jar && \
    rm /opt/spark/jars/jmespath-java-*.jar && \
    rm /opt/spark/jars/aws-java-sdk-core-*.jar && \
    rm /opt/spark/jars/aws-java-sdk-kms-*.jar && \
    rm /opt/spark/jars/aws-java-sdk-s3-*.jar && \
    rm /opt/spark/jars/ion-java-1.0.2.jar

RUN echo $'\n\
spark.eventLog.enabled                      true\n\
spark.history.ui.port                       18080\n\
' > /opt/spark/conf/spark-defaults.conf

ENTRYPOINT ["/bin/bash", "-c"]


cp: cannot stat ‘/root/Users/<username>/<bundle>.pem’: No such file or directory

Without this, when I try to run using , it gives the warning curl: (6) Could not resolve host:

and then the next command fails with the error spark-3.1.1-bin-without-hadoop.tgz: Cannot open: No such file or directory

RUN curl    -O --insecure  ./spark-3.1.1-bin-without-hadoop.tgz  -I https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-without-hadoop.tgz

RUN tar -xzf spark-3.1.1-bin-without-hadoop.tgz
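(One likely reason the tar step finds no file, judging from the flags shown: curl treats ./spark-3.1.1-bin-without-hadoop.tgz as a second URL rather than an output path, and -I sends HEAD requests, which only fetch headers and write nothing to disk. A sketch of a corrected download step, using -o to name the output file explicitly:)

```dockerfile
# -f fails the build on HTTP errors, -L follows redirects,
# -o writes the response body to the named file (unlike -I, which
# only fetches headers and saves nothing).
RUN curl -fL -o spark-3.1.1-bin-without-hadoop.tgz \
    https://archive.apache.org/dist/spark/spark-3.1.1/spark-3.1.1-bin-without-hadoop.tgz
```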

pbh

1 Answer


If your pem file is outside of Docker and you are trying to copy it into the image, you need to put the pem file beside your Dockerfile and use the ADD or COPY instruction:

COPY bundle.pem /etc/pki/ca-trust/source/anchors/

It is also better not to use the relative path ~ in Dockerfiles: RUN cp resolves ~ inside the container (to /root), so it can never reach a file on the Windows host.
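A minimal sketch of the relevant part of the Dockerfile, assuming the bundle has been placed next to the Dockerfile and renamed bundle.pem (the filename is a placeholder):

```dockerfile
# bundle.pem must be inside the build context (the directory passed to
# docker build); files elsewhere on the Windows host are not visible
# to the build.
COPY bundle.pem /etc/pki/ca-trust/source/anchors/
RUN update-ca-trust
```

Then run the build from the directory containing both files, e.g. docker build -t myimage . so the pem file is part of the build context.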

Aref Riant