AWS service for doing jobs


I have the following need: the code needs to call some APIs, fetch some data, and store it in a database (a flat file will do for our purpose). Since the APIs expose a huge number of records, we want to split the work into 30 parts, each part scraping a certain section of the data from the APIs. We want these 30 scrapers to run on 30 different machines, and for that we have a Python program that does the following (a simplified sketch of the worker follows the list):

  1. Call the API and get the data, based on parameters (which part of the API to call).
  2. Dump it to a local flat file.
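
Roughly, each worker looks like this (a simplified sketch; the API URL, parameters, and output layout here are placeholders, not our real API):

```python
# Simplified sketch of one scraper worker: fetch its assigned slice of records
# from the API and append them to a local flat file. The URL, parameters, and
# file layout are placeholders.
import json
import sys

import requests

API_URL = "https://api.example.com/records"  # placeholder endpoint


def scrape_part(part: int, total_parts: int = 30, out_path: str = "output.jsonl") -> None:
    # Ask the API for only the section of data assigned to this worker.
    resp = requests.get(API_URL, params={"part": part, "of": total_parts}, timeout=60)
    resp.raise_for_status()
    with open(out_path, "a") as fh:
        for record in resp.json():
            fh.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    scrape_part(int(sys.argv[1]))
```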

Later, we will merge the output from the 30 files into one giant DB. The question is: which AWS tool should we use for our purpose? We can use EC2 instances, but we have to keep the EC2 console open on our desktop where we connect to it to run the Python program, and it is not feasible to keep 30 connections open on my laptop. It is very complicated to get remote desktop on those machines, so logging in there, starting the job, and then disconnecting is also not feasible.

What we want is this: start the tasks (one each on 30 machines), let them run and finish by themselves, and if possible notify me (or I can check on their health myself periodically).

Can anyone guide me on which AWS tool suits our purpose, and how?

CodePudding user response:

"We can use EC2 instance, but we have to keep the EC2 console open on our desktop where we connect to it to run the Python program"

That just means you are running the script wrong, and you need to look into running it as a service.
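
For example, one way to kick off the script on an instance without keeping a session open is SSM Run Command. This is only a sketch; it assumes the instances run the SSM agent and have an IAM role allowing ssm:SendCommand, and the instance ID, paths, and part number are placeholders:

```python
# Sketch: start the scraper on an EC2 instance without keeping an SSH session
# open, using SSM Run Command. Assumes the SSM agent is installed and the
# instance profile permits ssm:SendCommand; IDs and paths are placeholders.
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

response = ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],   # placeholder instance ID
    DocumentName="AWS-RunShellScript",     # built-in SSM document
    Parameters={
        "commands": [
            # nohup + & lets the scraper keep running after the command returns
            "nohup python3 /home/ec2-user/scraper.py 7 > /home/ec2-user/part7.log 2>&1 &"
        ]
    },
)
print(response["Command"]["CommandId"])
```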

In general, you need to look into queueing these tasks up in SQS and then triggering either EC2 auto-scaling or Lambda functions, depending on whether your script can run within the Lambda runtime restrictions.
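
A minimal sketch of the queueing side, assuming a queue named `scraper-tasks` already exists (the queue name and partition scheme are placeholders):

```python
# Sketch: enqueue one SQS message per data partition; workers (EC2 instances
# behind an auto-scaling group, or Lambda functions triggered by the queue)
# pick up messages and scrape their section. The queue name is a placeholder.
import json

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.get_queue_url(QueueName="scraper-tasks")["QueueUrl"]

for part in range(30):
    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody=json.dumps({"part": part}),  # which section of the API to scrape
    )
```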

CodePudding user response:

This seems like a good application for Step Functions. Step Functions let you orchestrate multiple Lambda functions, Glue jobs, and other services into a business process. You could write Lambda functions that call the API endpoints and store the results in S3. Once all the data has been gathered, your step function could trigger a Lambda function, Glue job, or something else that processes the data into your database. Step Functions help with error handling and retries, and allow easy monitoring of your process.
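
As a rough illustration, once you have defined a state machine with a Map state that fans the 30 parts out to a scraper Lambda and then runs a merge step, you could start it from Python roughly like this (the state machine ARN and input shape are assumptions, not something defined above):

```python
# Sketch: start a Step Functions execution whose Map state fans the 30 parts
# out to a scraper Lambda, followed by a merge step. The ARN and input format
# are placeholders for whatever state machine you actually define.
import json

import boto3

sfn = boto3.client("stepfunctions", region_name="us-east-1")

execution = sfn.start_execution(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:scrape-and-merge",
    input=json.dumps({"parts": list(range(30))}),
)
print(execution["executionArn"])
```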
