I have an EKS cluster with Fargate compute capacity, and I am now adding an EKS node group as additional compute capacity. I have written a Terraform script that creates the EKS node group and a launch template for the new node group.
When I run the Terraform script using the EKS cluster owner role, I get the following error message:
Error: error waiting for EKS Node Group to create: unexpected state 'CREATE_FAILED', wanted target 'ACTIVE'. last error: 1 error occurred:
* : AccessDenied: The aws-auth ConfigMap in your cluster is invalid.
Terraform code:
#--- setup launch template for eks nodegroups ---#
resource "aws_launch_template" "eks_launch_template" {
name = "launch-template"
key_name = var.ssh_key_name
block_device_mappings {
device_name = "/dev/xvda"
ebs {
volume_size = var.disk_size
}
}
tag_specifications{
resource_type= "instance"
tags = merge(var.tags, { Name = "${local.name_prefix}-eks-node" })
}
tag_specifications{
resource_type= "volume"
tags = var.tags
}
tag_specifications{
resource_type= "network-interface"
tags = var.tags
}
tag_specifications{
resource_type= "spot-instances-request"
tags = var.tags
}
vpc_security_group_ids =[aws_security_group.eks_worker_node_sg.id]
}
#--- setup eks ondemand nodegroup ---#
resource "aws_eks_node_group" "eks_on_demand" {
cluster_name = aws_eks_cluster.eks_cluster.name
node_group_name = "${local.name_prefix}-group"
node_role_arn = aws_iam_role.eks_ec2_role.arn
subnet_ids = var.private_subnets
instance_types = var.nodegroup_instance_types
launch_template {
id = aws_launch_template.eks_launch_template.id
version = aws_launch_template.eks_launch_template.latest_version
}
scaling_config {
desired_size = var.desire_size
max_size = var.max_size
min_size = var.min_size
}
update_config {
max_unavailable = 1
}
tags = var.tags
lifecycle {
ignore_changes = [scaling_config[0].desired_size]
}
}
#--- eks ec2 node iam role ---#
resource "aws_iam_role" "eks_ec2_role" {
name = "${local.name_prefix}-eks-node-role"
assume_role_policy = jsonencode({
Statement = [{
Action = "sts:AssumeRole"
Effect = "Allow"
Principal = {
Service = "ec2.amazonaws.com"
}
}]
Version = "2012-10-17"
})
}
#--- attach workernode policy to ec2---#
resource "aws_iam_role_policy_attachment" "eks_ec2_policy" {
policy_arn = "arn:aws:iam::aws:policy/AmazonEKSWorkerNodePolicy"
role = aws_iam_role.eks_ec2_role.name
}
#--- attach cni policy to ec2---#
resource "aws_iam_role_policy_attachment" "eks_ec2_CNI_Policy" {
policy_arn = "arn:aws:iam::aws:policy/AmazonEKS_CNI_Policy"
role = aws_iam_role.eks_ec2_role.name
}
#-- attach ecr read access policy to ec2 ---#
resource "aws_iam_role_policy_attachment" "eks_ec2_ecr_read_policy" {
policy_arn = "arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly"
role = aws_iam_role.eks_ec2_role.name
}
CodePudding user response:
It's due to the aws-auth ConfigMap in the kube-system namespace. I am not using Fargate, but it looks like it is the same error.
You can either ignore role changes in the aws-auth ConfigMap or delete it via Terraform, then try adding the node group to the cluster again.
var.tf
# aws-auth configuration
variable "kubernetes_config_map_ignore_role_changes" {
type = bool
default = true
description = "Set to `true` to ignore IAM role changes in the Kubernetes Auth ConfigMap"
}
main.tf
kubernetes_config_map_ignore_role_changes = var.kubernetes_config_map_ignore_role_changes
Module ref: https://github.com/cloudposse/terraform-aws-eks-cluster
Read more about the issue: https://www.lisenet.com/2021/aws-eks-access-denied-the-aws-auth-configmap-in-your-cluster-is-invalid/
AWS EKS open issue: https://github.com/aws/containers-roadmap/issues/185
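If you manage the aws-auth ConfigMap yourself instead of through the Cloud Posse module, the rough equivalent of that variable is a lifecycle block on the ConfigMap resource. Below is a minimal sketch, assuming the Kubernetes provider is configured against the cluster; the resource name aws_auth and the standard node-role mapping are illustrative, not taken from the original post.

# Sketch only: manage aws-auth in Terraform and ignore role-mapping drift,
# mirroring kubernetes_config_map_ignore_role_changes = true.
resource "kubernetes_config_map" "aws_auth" {
  metadata {
    name      = "aws-auth"
    namespace = "kube-system"
  }

  data = {
    # mapRoles must be a YAML string; yamlencode keeps it valid.
    mapRoles = yamlencode([
      {
        rolearn  = aws_iam_role.eks_ec2_role.arn        # node role from the question
        username = "system:node:{{EC2PrivateDNSName}}"  # standard worker-node mapping
        groups   = ["system:bootstrappers", "system:nodes"]
      },
    ])
  }

  lifecycle {
    # Equivalent of the module variable: don't fight EKS over later role changes.
    ignore_changes = [data]
  }
}

With ignore_changes on data, Terraform creates the ConfigMap once and then leaves subsequent role-mapping updates (for example, entries added by EKS for managed node groups) alone.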
CodePudding user response:
The issue was in my aws-auth ConfigMap. It looks like AWS EKS performs validation on the ConfigMap: if your role mappings share a common username, EKS will throw this error. For example:
- groups:
    - Dev-viewer
  rolearn: arn:aws:iam::<>:role/<>
  username: {{SessionName}}
- groups:
    - Dev-manager
  rolearn: arn:aws:iam::<>:role/<>
  username: {{SessionName}}
- groups:
    - Dev-admin
  rolearn: arn:aws:iam::<>:role/<>
  username: {{SessionName}}
I changed the username section of each role:
- groups:
    - Dev-viewer
  rolearn: arn:aws:iam::<>:role/<>
  username: view-{{SessionName}}
- groups:
    - Dev-manager
  rolearn: arn:aws:iam::<>:role/<>
  username: manager-{{SessionName}}
- groups:
    - Dev-admin
  rolearn: arn:aws:iam::<>:role/<>
  username: admin-{{SessionName}}
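If the mapRoles section is rendered from Terraform (for example, when managing aws-auth with the Kubernetes provider as sketched in the previous answer), the same fix can be expressed as data. A minimal sketch, assuming the local name auth_role_mappings is illustrative and the role ARNs stay as the post's <> placeholders:

# Sketch: build mapRoles with a distinct username per mapping so EKS validation passes.
locals {
  auth_role_mappings = [
    { groups = ["Dev-viewer"],  rolearn = "arn:aws:iam::<>:role/<>", username = "view-{{SessionName}}" },
    { groups = ["Dev-manager"], rolearn = "arn:aws:iam::<>:role/<>", username = "manager-{{SessionName}}" },
    { groups = ["Dev-admin"],   rolearn = "arn:aws:iam::<>:role/<>", username = "admin-{{SessionName}}" },
  ]

  # Rendered YAML string for the mapRoles key of the aws-auth ConfigMap.
  map_roles_yaml = yamlencode(local.auth_role_mappings)
}

local.map_roles_yaml could then be assigned to the mapRoles key in the ConfigMap's data, keeping every mapping's username unique in one place.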