Can't SSH into EC2 instance launched from an autoscale group


TL;DR: I am spawning an EC2 instance using an autoscale group, and I can reach it over the network (the SSH handshake succeeds), but I cannot log in to it using the SSH key pair I specified in the autoscale group.


I have used Terraform to create an autoscale group to launch an EC2 instance. Here is the autoscale group:

module "ssh_key_pair" {
  source  = "cloudposse/key-pair/aws"
  version = "0.18.3"

  name      = "myproj-ec2"

  ssh_public_key_path = "."
  generate_ssh_key    = true
}
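With generate_ssh_key = true, the module should write the key files next to the configuration (per ssh_public_key_path = ".") and register the public half with EC2. A quick sanity check (a sketch; the file names assume the module names them after the key pair):

# Key files written by the key-pair module (assumed naming: <name> and <name>.pub)
ls -l myproj-ec2 myproj-ec2.pub

# Confirm the key pair is registered with EC2 under the expected name
aws ec2 describe-key-pairs --key-names myproj-ec2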

module "autoscale_group" {
  source  = "cloudposse/ec2-autoscale-group/aws"
  version = "0.30.0"

  name      = "myproj"

  image_id                    = data.aws_ami.amazon_linux_2.id
  instance_type               = "t2.small"
  security_group_ids          = [module.sg.id]
  subnet_ids                  = module.subnets.public_subnet_ids
  health_check_type           = "EC2"
  min_size                    = 1
  desired_capacity            = 1
  max_size                    = 1
  wait_for_capacity_timeout   = "5m"
  associate_public_ip_address = true
  user_data_base64            = base64encode(templatefile("${path.module}/user_data.tpl", { cluster_name = aws_ecs_cluster.default.name }))

  key_name = module.ssh_key_pair.key_name

  # Auto-scaling policies and CloudWatch metric alarms
  autoscaling_policies_enabled           = true
  cpu_utilization_high_threshold_percent = "70"
  cpu_utilization_low_threshold_percent  = "20"
}
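Because the ASG launches the instance indirectly, it's worth verifying that key_name actually reached the launched instance rather than just the Terraform state. A hedged check via the CLI (the tag below is the one the ASG applies automatically; <asg-name> is a placeholder):

# Show which key pair (if any) the ASG's instances were launched with
aws ec2 describe-instances \
  --filters "Name=tag:aws:autoscaling:groupName,Values=<asg-name>" \
  --query 'Reservations[].Instances[].{id:InstanceId,key:KeyName,state:State.Name}'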

And the user_data.tpl file looks like this:

#!/bin/bash
echo ECS_CLUSTER=${cluster_name} >> /etc/ecs/ecs.config

# Set up crontab file
echo "[email protected]" >> /var/spool/cron/ec2-user
echo " " >> /var/spool/cron/ec2-user
echo "# Clean docker files once a week" >> /var/spool/cron/ec2-user
echo "0 0 * * 0 /usr/bin/docker system prune -f" >> /var/spool/cron/ec2-user
echo " " >> /var/spool/cron/ec2-user

start ecs
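For what it's worth, I can confirm the rendered template is attached to the instance by reading the user data back (a sketch; <instance-id> is a placeholder):

# Fetch and decode the user data attached to the running instance
aws ec2 describe-instance-attribute --instance-id <instance-id> \
  --attribute userData --query 'UserData.Value' --output text | base64 --decode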

The instance is spawned, and when I SSH to its public DNS name for the first time, the connection itself succeeds: the server presents a host key matching the one listed in the instance's console output, and after I approve it, the key is added to ~/.ssh/known_hosts.

However, despite having created the ssh_key_pair and passing its key_name to the autoscale group, I am not able to log in to the spawned instance. (I've checked, and the key pair exists in the AWS console under the expected name.) When I use ssh on the command line, specifying the private half of the generated key pair, the handshake above succeeds, but authentication ultimately fails with:

debug1: No more authentication methods to try.
ec2-user@<instance-public-dns>: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
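The full command looks like this, with verbose output enabled so ssh reports which identities it offers (the key path assumes the file generated by the key-pair module above):

# -vvv shows each key ssh offers; IdentitiesOnly keeps the agent from
# offering unrelated keys ahead of the one passed with -i
ssh -vvv -o IdentitiesOnly=yes -i ./myproj-ec2 ec2-user@<instance-public-dns>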

When I use the Connect button in the AWS Console and click the "SSH client" tab, it says:

No associated key pair

This instance is not associated with a key pair. Without a key pair, you can't connect to the instance through SSH.

You can connect using EC2 Instance Connect with just a valid username. You can connect using Session Manager if you have been granted the necessary permissions.
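(For reference, the Session Manager route it mentions would look like the sketch below, assuming the SSM agent is running, the instance profile grants the necessary ssm permissions, and the Session Manager plugin is installed locally.)

aws ssm start-session --target <instance-id>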

I also can't use EC2 Instance Connect, which fails with:

There was a problem connecting to your instance

Log in failed. If this instance has just started up, wait a few minutes and try again. Otherwise, ensure the instance is running on an AMI that supports EC2 Instance Connect.

I'm using the most_recent AMI matching the pattern amzn2-ami-ecs-hvm.*x86_64-ebs, which, as I understand it, comes with EC2 Instance Connect pre-installed.
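To double-check what that pattern resolves to, here is the equivalent lookup from the CLI (a sketch, assuming the standard Amazon-owned ECS-optimized images; the CLI filter uses globs rather than a regex):

# Newest Amazon-owned AMI whose name matches the ECS-optimized AL2 pattern
aws ec2 describe-images --owners amazon \
  --filters "Name=name,Values=amzn2-ami-ecs-hvm*x86_64-ebs" \
  --query 'sort_by(Images,&CreationDate)[-1].{id:ImageId,name:Name}'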

Am I missing a step in the user_data template? I've also read that the instance's IAM role can affect this, but I can't figure out how to configure that for an automatically launched instance like this.

CodePudding user response:

What you've posted now, and in your previous questions, is correct. There is no reason why you shouldn't be able to SSH into the instance. Make sure that you are using the myproj-ec2 private SSH key in your ssh command, for example:

ssh -i ./myproj-ec2 ec2-user@<instance-public-ip-address>
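Note that ssh refuses a private key file whose permissions are too open, so also make sure the file is readable only by you:

chmod 600 ./myproj-ec2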

Also, EC2 Instance Connect is not pre-installed on ECS-optimized AMIs. You would have to install it manually if you want to use it.
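For example, on Amazon Linux 2 the install would be the following (a sketch; you would need another way onto the instance first, such as user_data or Session Manager):

# Install EC2 Instance Connect on Amazon Linux 2
sudo yum install -y ec2-instance-connect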

P.S. I'm not checking your user_data or any IAM roles, as they are not related to your SSH issue. If you have problems with those, a new question should be asked.
