Terraform plan AWS Unauthorized issue

Time: 05-30

UPD: Resolved by switching the AWS user Terraform runs as to one listed in the cluster's map_users (in the aws-auth ConfigMap).

I'm not a DevOps person, so apologies if this is a basic question. I'm trying to get an existing Terraform configuration working, but it fails at the terraform plan step. The IAM user whose access key/secret is used appears to have all the required permissions, yet the error persists, so it seems some permission is missing. Any ideas what it could be?
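A useful first check for this kind of Unauthorized error (not something from the original question, and it assumes the aws CLI is installed and configured with the same credentials Terraform uses) is to confirm which IAM identity the provider's exec block actually presents to the cluster:

```shell
# Show the IAM identity the AWS CLI resolves to; this is the identity
# the kubernetes provider's exec block hands to the EKS API server.
aws sts get-caller-identity

# Confirm a token can be minted for the cluster (the same call the
# provider makes). "my-cluster" is a placeholder for your cluster name.
aws eks get-token --cluster-name my-cluster
```

If get-caller-identity returns a user or role that is not mapped in the cluster's aws-auth ConfigMap, the token is valid AWS-side but the Kubernetes API server rejects it with Unauthorized.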

The error is:

Error: Invalid credentials
│ 
│   with kubernetes_manifest.virtual_service["graphql-api"],
│   on istio.tf line 42, in resource "kubernetes_manifest" "virtual_service":
│   42: resource "kubernetes_manifest" "virtual_service" {
│ 
│ The credentials configured in the provider block are not accepted by the
│ API server. Error: Unauthorized

This is provider.tf:

terraform {
  required_version = ">= 1.1.5"
  required_providers {
    kubernetes = {
      source  = "hashicorp/kubernetes"
      version = ">= 2.11.0"
    }
    helm = {
      source  = "hashicorp/helm"
      version = ">= 2.5.1"
    }
    aws = {
      source  = "hashicorp/aws"
      version = ">= 4.15.1"
    }
  }
}
provider "aws" {
  region = var.region
  access_key = var.aws_key
  secret_key = var.aws_secret
}
data "aws_eks_cluster" "eks" {
  name = var.cluster_name
}
provider "kubernetes" {
  host                   = data.aws_eks_cluster.eks.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.eks.certificate_authority[0].data)
  exec {
    api_version = "client.authentication.k8s.io/v1alpha1"
    args        = ["eks", "get-token", "--cluster-name", var.cluster_name]
    command     = "aws"
  }
}
data "aws_caller_identity" "current" {}
provider "helm" {
  kubernetes {
    host                   = data.aws_eks_cluster.eks.endpoint
    cluster_ca_certificate = base64decode(data.aws_eks_cluster.eks.certificate_authority[0].data)
    exec {
      api_version = "client.authentication.k8s.io/v1alpha1"
      args        = ["eks", "get-token", "--cluster-name", var.cluster_name]
      command     = "aws"
    }
  }
}
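One unrelated thing worth flagging in the config above (an assumption about version compatibility, not something diagnosed from this particular error): client.authentication.k8s.io/v1alpha1 was removed from newer AWS CLI releases, so with a recent aws binary the exec blocks may need to be bumped to v1beta1, roughly:

```hcl
exec {
  api_version = "client.authentication.k8s.io/v1beta1"
  args        = ["eks", "get-token", "--cluster-name", var.cluster_name]
  command     = "aws"
}
```

With an older CLI that only speaks v1alpha1, the symptom is typically a token/exec error rather than Unauthorized, so this is a hardening note rather than the fix here.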

CodePudding user response:

The solution is to run Terraform as an AWS user listed in the cluster's map_users configuration (the mapUsers entry of the aws-auth ConfigMap). Thanks to @MarkoE.
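For reference, a mapUsers entry in the aws-auth ConfigMap (kube-system namespace) looks roughly like the sketch below; the account ID, user ARN, and username are placeholders, and system:masters grants full cluster admin, so a narrower group may be more appropriate:

```yaml
# View/edit with: kubectl edit configmap aws-auth -n kube-system
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapUsers: |
    - userarn: arn:aws:iam::111122223333:user/terraform
      username: terraform
      groups:
        - system:masters
```

If the cluster is managed with the terraform-aws-eks module, the same mapping is usually expressed through its map_users input variable rather than by editing the ConfigMap directly.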
