Give cluster admin access to EKS worker nodes


We have an EKS cluster running version 1.21. We want to give admin access to the worker nodes, so we modified the aws-auth ConfigMap and added "system:masters" to the EKS worker node role entry. Below is the snippet from the modified ConfigMap.

data:
  mapRoles: |
    - groups:
      - system:nodes
      - system:bootstrappers
      - system:masters
      rolearn: arn:aws:iam::686143527223:role/terraform-eks-worker-node-role
      username: system:node:{{EC2PrivateDNSName}}

After adding this section, the EKS worker nodes successfully got admin access to the cluster, but in the EKS dashboard the node groups are in a degraded state. The Health issues section shows the error below, and we are not able to update the cluster because of it. Please help.

Your worker nodes do not have access to the cluster. Verify if the node instance role is present and correctly configured in the aws-auth ConfigMap.

CodePudding user response:

The error message indicates that the instance role (terraform-eks-worker-node-role) is either missing the AWS managed policy AmazonEKSWorkerNodePolicy or is no longer configured in the aws-auth ConfigMap the way EKS expects for worker nodes. AWS publishes a troubleshooting guide for this health issue that is worth going through.
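On the aws-auth side, EKS expects the node instance role to be mapped with only the system:bootstrappers and system:nodes groups; the degraded state described in the question appeared right after system:masters was added to that entry. A minimal sketch of the standard node-role mapping, reusing the role ARN from the question, would be:

# Standard mapping for EKS worker nodes: bootstrap and node groups only,
# no system:masters on the node role entry.
data:
  mapRoles: |
    - groups:
      - system:bootstrappers
      - system:nodes
      rolearn: arn:aws:iam::686143527223:role/terraform-eks-worker-node-role
      username: system:node:{{EC2PrivateDNSName}}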

To provide cluster admin to your agent pod, bind the cluster-admin ClusterRole to the service account used by your agent pod:

apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding    # cluster-scoped, so the access is not limited to a single namespace
metadata:
  name: <a name of your choice>
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin       # built-in role with full access to every resource
subjects:
- kind: ServiceAccount
  name: <service account used by your agent pod>
  namespace: <namespace where your agent runs>
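For completeness, the service account and the pod that uses it would look roughly like the sketch below. The names (agent-sa, agent-namespace, the image) are placeholders, not anything from the original question; the important part is that spec.serviceAccountName matches the subject in the binding above.

apiVersion: v1
kind: ServiceAccount
metadata:
  name: agent-sa                  # hypothetical name; must match the subject in the binding
  namespace: agent-namespace
---
apiVersion: v1
kind: Pod
metadata:
  name: agent
  namespace: agent-namespace
spec:
  serviceAccountName: agent-sa    # the pod's API requests now carry cluster-admin permissions
  containers:
  - name: agent
    image: <your agent image>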