I am trying to access an EKS cluster that was created earlier with Terraform through an Azure DevOps pipeline, for testing purposes. The pipeline runs on an agent in AWS that is not publicly reachable over SSH. When I try to access the cluster, kubectl fails with `error: You must be logged in to the server (Unauthorized)`.
I understand that when an Amazon EKS cluster is created, the IAM entity (user or role) that created it is added to the Kubernetes RBAC authorization table as the administrator, and that initially only this IAM entity can make calls to the Kubernetes API server using kubectl.
I am a federated user who assumes an admin role in the AWS account. Is there a way to add my role's credentials to the cluster to allow access?
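From what I have read, access is controlled by the `aws-auth` ConfigMap in the `kube-system` namespace, and the IAM identity the pipeline used to run Terraform has implicit `system:masters` access as the cluster creator, so it should be able to grant my role access by adding a `mapRoles` entry. Would something like the following sketch work if run from the pipeline? (I am assuming the `hashicorp/kubernetes` provider; the cluster name and the account ID in the ARN are placeholders.)

```hcl
# Sketch: patch the aws-auth ConfigMap so the "admins" role is mapped to
# the cluster-admin group. This must run as an identity that already has
# access, e.g. the role the pipeline used to create the cluster.

variable "cluster_name" {
  type = string
}

data "aws_eks_cluster" "this" {
  name = var.cluster_name
}

data "aws_eks_cluster_auth" "this" {
  name = var.cluster_name
}

provider "kubernetes" {
  host                   = data.aws_eks_cluster.this.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.this.token
}

resource "kubernetes_config_map_v1_data" "aws_auth" {
  metadata {
    name      = "aws-auth"
    namespace = "kube-system"
  }

  data = {
    # Use the IAM role ARN here, not the STS assumed-role ARN that
    # get-caller-identity prints. Account ID is a placeholder.
    mapRoles = yamlencode([
      {
        rolearn  = "arn:aws:iam::111122223333:role/admins"
        username = "admins"
        groups   = ["system:masters"]
      }
    ])
  }

  force = true # take ownership of the mapRoles key
}
```

One thing I am unsure about: `force = true` would overwrite any existing `mapRoles` entries (for example, node group roles), so those would need to be included in the same list.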
`aws sts get-caller-identity` returns my credentials as below:

```
UserId:  ******
Account: *****
Arn:     arn:aws:sts:************:assumed-role/admins/{accountname}
```
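If it matters, my understanding is that the `Arn` above is the STS assumed-role session ARN, while the `aws-auth` mapping needs the underlying IAM role ARN. In Terraform that could presumably be derived without hard-coding the account ID; a sketch, with the role name `admins` taken from the ARN above:

```hcl
# Sketch: build the IAM role ARN for the aws-auth mapping from the
# current caller identity, assuming Terraform runs in the same account.
# arn:aws:sts::<account>:assumed-role/admins/<session>
#   -> arn:aws:iam::<account>:role/admins
data "aws_caller_identity" "current" {}

locals {
  admin_role_arn = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/admins"
}
```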
Alternatively, if I re-create the cluster the same way with Terraform through the pipeline, which credentials should I add to the configuration so that I can access it with my current role? I am not able to create a new IAM user.
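For the re-create option, I gather the community terraform-aws-modules/eks module can manage the `aws-auth` ConfigMap itself, so the role mapping could be declared at creation time. A sketch, assuming module v19 (older versions named the input `map_roles`), with placeholder names, versions, and IDs; it would also need a configured kubernetes provider, as in the sketch above, to apply the ConfigMap:

```hcl
# Sketch: create the cluster and map the "admins" role as cluster admin
# in one pass, so access does not depend on the cluster-creator identity.
# Cluster name, version, account ID, VPC, and subnets are placeholders.

variable "vpc_id" {
  type = string
}

variable "subnet_ids" {
  type = list(string)
}

module "eks" {
  source  = "terraform-aws-modules/eks/aws"
  version = "~> 19.0"

  cluster_name    = "test-cluster"
  cluster_version = "1.27"
  vpc_id          = var.vpc_id
  subnet_ids      = var.subnet_ids

  # Have the module manage aws-auth and add my role to it.
  manage_aws_auth_configmap = true
  aws_auth_roles = [
    {
      rolearn  = "arn:aws:iam::111122223333:role/admins"
      username = "admins"
      groups   = ["system:masters"]
    }
  ]
}
```

Then, from my own session, `aws eks update-kubeconfig --name test-cluster` should write a kubeconfig that authenticates as the assumed role.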
Any help would be greatly appreciated.