I have two VM instances on GCE with a self-installed Kubernetes cluster (set up following https://medium.com/edureka/install-kubernetes-on-ubuntu-5cd1f770c9e4).
I'm trying to create a volume and use it in my pods.
I created the following disk:
gcloud compute disks create --type=pd-ssd --size=10GB manual-disk-1
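(The disk has to be in the same zone as the node that will attach it; a zone can be passed explicitly, where us-central1-a below is just a placeholder for the actual zone:)

gcloud compute disks create manual-disk-1 --type=pd-ssd --size=10GB --zone=us-central1-a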
And created the following YAML files:
pv_manual.yaml:
apiVersion: v1
kind: PersistentVolume
metadata:
  name: manually-created-pv
spec:
  accessModes:
    - ReadWriteMany
  capacity:
    storage: 10Gi
  persistentVolumeReclaimPolicy: Retain
  gcePersistentDisk:
    pdName: manual-disk-1
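The PV can be created and checked with the standard commands:

kubectl apply -f pv_manual.yaml
kubectl get pv manually-created-pv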
pvc_manual.yaml:
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: mypvc
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 10Gi
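Since no storageClassName is set on either object (and assuming the cluster has no default StorageClass that would trigger dynamic provisioning), the claim should bind to the PV by matching capacity and access mode; binding can be verified with:

kubectl apply -f pvc_manual.yaml
kubectl get pvc mypvc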
pod.yaml:
apiVersion: v1
kind: Pod
metadata:
  name: sleppypod
spec:
  volumes:
    - name: data
      persistentVolumeClaim:
        claimName: mypvc
  containers:
    - name: sleppycontainer
      image: gcr.io/google_containers/busybox
      command:
        - sleep
        - "5000"
      volumeMounts:
        - name: data
          mountPath: /data
          readOnly: false
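The pod is created and inspected with:

kubectl apply -f pod.yaml
kubectl describe pod sleppypod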
When I try to create the pod, it gets stuck in ContainerCreating status,
and in kubectl get events
I see:
7s Warning FailedAttachVolume AttachVolume.NewAttacher failed for volume : Failed to get GCE GCECloudProvider with error
I run my two instances with a service account that has the Compute Instance Admin role (following Kubernetes: Failed to get GCE GCECloudProvider with error <nil>), and my kubelet is
running with --cloud-provider=gce
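For reference, the flag is wired into the kubelet roughly like this; the drop-in path and file name below are assumptions based on a standard kubeadm-style install and may differ:

# /etc/systemd/system/kubelet.service.d/20-gce.conf  (path/name are assumptions)
[Service]
Environment="KUBELET_EXTRA_ARGS=--cloud-provider=gce"

# reload and restart after editing
systemctl daemon-reload
systemctl restart kubelet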
How can I solve it?