Bash Script to run AWS Cli command in parallel to reduce time


Sorry, I am still new to bash scripting. I have around 10,000 EC2 instances, and I wrote this bash script to change their instance type; the instance IDs and target types are stored in a file. The code works, but it takes very long because it goes through the instances one by one.

Does anyone know if I can run the AWS CLI commands on all EC2 instances in one go? Thanks :)

#!/bin/bash

my_file='test.txt'

declare -a instanceID
declare -a fmo #Future Instance Size

while IFS=, read -r COL1 COL2; do

   instanceID+=("$COL1")
   fmo+=("$COL2")

done <"$my_file"

len=${#instanceID[@]}

for (( i=0; i < len; i++ )); do

   vm_instance_id="${instanceID[$i]}"
   vm_type="${fmo[$i]}"

   echo "Stopping $vm_instance_id"
   aws ec2 stop-instances --instance-ids "$vm_instance_id"

   echo " Waiting for $vm_instance_id state to be STOP "
   aws ec2 wait instance-stopped --instance-ids $vm_instance_id


   echo "Resizing $vm_instance_id to $vm_type"
   aws ec2 modify-instance-attribute --instance-id "$vm_instance_id" --instance-type "$vm_type"
   


   echo "Starting $vm_instance_id"
   aws ec2 start-instances --instance-ids "$vm_instance_id"
    

done

CodePudding user response:

Refactor your code into a function that is passed one line from the file.

work() {
   IFS=, read -r instanceID fmo <<<"$1"
   stuff "$instanceID" "$fmo"
}
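
For illustration, here is one way the work function could look with the question's stop/resize/start steps filled in place of stuff (just a sketch that reuses the same AWS CLI calls as the original script):

work() {
   local instanceID fmo
   IFS=, read -r instanceID fmo <<<"$1"

   echo "Stopping $instanceID"
   aws ec2 stop-instances --instance-ids "$instanceID"
   aws ec2 wait instance-stopped --instance-ids "$instanceID"

   echo "Resizing $instanceID to $fmo"
   aws ec2 modify-instance-attribute --instance-id "$instanceID" --instance-type "$fmo"

   echo "Starting $instanceID"
   aws ec2 start-instances --instance-ids "$instanceID"
}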

Run GNU xargs or GNU parallel over each line of the file and have it call the exported function. Use the -P option to run the function in parallel; see the documentation.

export -f work
xargs -d '\n' -n 1 -P0 -t bash -c 'work "$@"' -- <"$my_file"
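
If GNU parallel is installed, roughly the same thing can be written as follows (a sketch; -j 20 is an arbitrary concurrency cap picked here, not something from the original answer):

export -f work
# Each line of the file becomes {} and is handed to the exported work function.
parallel -j 20 work {} :::: "$my_file"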

CodePudding user response:

As @KamilCuk pointed out here, you can easily make this run in parallel. However, if you run this script with high concurrency, you might end up getting throttled by the EC2 API, so make sure you include some backoff/retry logic and respect the limits described here: https://docs.aws.amazon.com/AWSEC2/latest/APIReference/throttling.html
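
For example (a sketch, not part of the answer above): with AWS CLI v2 you can raise the built-in retry budget through the AWS_RETRY_MODE and AWS_MAX_ATTEMPTS environment variables, or wrap each call in a small hand-rolled exponential-backoff helper like the hypothetical with_backoff below:

# Built-in option (AWS CLI v2): adaptive mode backs off automatically
# when the API starts returning throttling errors.
export AWS_RETRY_MODE=adaptive
export AWS_MAX_ATTEMPTS=10

# Hand-rolled fallback: retry a command with exponential backoff.
with_backoff() {
   local attempt=1 max_attempts=5
   until "$@"; do
      (( attempt >= max_attempts )) && return 1
      sleep $(( 2 ** attempt ))
      (( attempt++ ))
   done
}

# Example usage with one of the calls from the question's script.
with_backoff aws ec2 stop-instances --instance-ids "$vm_instance_id"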
