Using Python pysftp download files from SFTP and store directly in AWS S3


Using Python, I am trying to download files from an SFTP server with the pysftp module and save them directly to an S3 bucket. The files must not be saved locally; they should go straight into S3, but I could not find an option in pysftp to put the files directly into S3.

I am using private SSH keys instead of a password to connect to the SFTP server, and the secrets are stored in AWS Secrets Manager. Please help.

import json

import boto3
import pysftp

def get_secret():
    client = boto3.client("secretsmanager")
    secret_name = "arn:name"
    region_name = "region"
    response = client.get_secret_value(SecretId=secret_name)
    secret_dict = json.loads(response['SecretString'])
    return secret_dict

sftp_secret = get_secret()
host = sftp_secret["host"]
username = sftp_secret["username"]
private_key = sftp_secret["private_key"]

with pysftp.Connection(host=host, username=username, private_key=private_key) as sftp:
    print("connection established")
    sftp.cwd('/')
    list_dir = sftp.listdir_attr()
    # This saves the file to the local filesystem, which is exactly what I want to avoid:
    sftp.get(remote_path, local_path)

CodePudding user response:

First, do not use pysftp; it is a dead project. See pysftp vs. Paramiko.

Use Paramiko instead:
Transfer file from SFTP to S3 using Paramiko
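
For reference, a minimal sketch of that approach, assuming the secret holds an RSA private key in PEM form; the remote path, bucket name, and object key below are placeholders:

import io

import boto3
import paramiko

sftp_secret = get_secret()  # as defined in the question

# Load the private key from the secret string instead of a file on disk
pkey = paramiko.RSAKey.from_private_key(io.StringIO(sftp_secret["private_key"]))

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # verify host keys properly in production
ssh.connect(sftp_secret["host"], username=sftp_secret["username"], pkey=pkey)

s3 = boto3.client("s3")
with ssh.open_sftp() as sftp:
    # Open the remote file as a file-like object and stream it to S3
    # without ever touching the local filesystem.
    with sftp.open("/path/to/remote/file.csv", "rb") as remote_file:
        remote_file.prefetch()  # speeds up sequential reads over SFTP
        s3.upload_fileobj(remote_file, "my-bucket", "incoming/file.csv")
ssh.close()

upload_fileobj reads the file-like object in chunks, so even large files are streamed to S3 rather than buffered in memory or written to disk.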

And even if you decide to stick with pysftp, the solution is basically the same: the pysftp Connection.open works the same way as Paramiko's SFTPClient.open, as sketched below.
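
For example, a sketch of the pysftp variant, where the bucket and object key are placeholders and host, username, private_key, and remote_path come from the question's code:

import boto3
import pysftp

s3 = boto3.client("s3")

with pysftp.Connection(host=host, username=username, private_key=private_key) as sftp:
    # Connection.open returns a file-like object that boto3 can stream
    # straight to S3, with nothing written to the local disk.
    with sftp.open(remote_path, "rb") as remote_file:
        s3.upload_fileobj(remote_file, "my-bucket", "my-object-key")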
