Store multiple data frames at the same time


I have 4 different data frames that I am storing in an S3 bucket. I can do it manually one by one, but I would like to store all 4 data frames by running the code just once.

import boto3
from io import StringIO

csv_buffer = StringIO()
data_frame.to_csv(csv_buffer)
s3_resource = boto3.resource('s3')
s3_resource.Object(S3_BUCKET_NAME, 'bucket_name/file_name.csv').put(Body=csv_buffer.getvalue())

This is what I have as of now. As I mentioned, it works, but I have to run it once per data frame.

CodePudding user response:

You could create a list with your 4 data frames and a matching list of filenames, then iterate through them together:

import boto3
from io import StringIO

def write_dataframe_to_csv_on_s3(dataframe, filename, S3_BUCKET_NAME):
    """Serialize one data frame to CSV in memory and upload it to S3."""
    csv_buffer = StringIO()
    dataframe.to_csv(csv_buffer)
    s3_resource = boto3.resource("s3")
    s3_resource.Object(S3_BUCKET_NAME, filename).put(Body=csv_buffer.getvalue())


# df1..df4 are your existing data frames; S3_BUCKET_NAME is your bucket name.
dataframeList = [df1, df2, df3, df4]
filenameList = ['df1_filename.csv', 'df2_filename.csv', 'df3_filename.csv', 'df4_filename.csv']

# zip pairs each data frame with its target filename.
for df, filename in zip(dataframeList, filenameList):
    write_dataframe_to_csv_on_s3(df, filename, S3_BUCKET_NAME)

Could something like that work in your case?
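If you would rather keep each data frame explicitly paired with its target filename, a dict avoids maintaining two parallel lists. Here is a minimal sketch: `dataframes_to_csv_bodies` is a hypothetical helper name, `df_a`/`df_b` are placeholder frames, and the upload step is shown as comments because it assumes the same boto3 setup and `S3_BUCKET_NAME` as above.

```python
import pandas as pd
from io import StringIO

def dataframes_to_csv_bodies(frames):
    """Render each data frame to an in-memory CSV string, keyed by filename."""
    bodies = {}
    for filename, df in frames.items():
        buf = StringIO()
        df.to_csv(buf)
        bodies[filename] = buf.getvalue()
    return bodies

# Placeholder frames; substitute your real df1..df4.
frames = {
    "df_a.csv": pd.DataFrame({"x": [1, 2]}),
    "df_b.csv": pd.DataFrame({"y": [3, 4]}),
}

bodies = dataframes_to_csv_bodies(frames)

# Upload step, same as the answer above. Creating the resource once and
# reusing it avoids rebuilding the client for every file.
# s3_resource = boto3.resource("s3")
# for filename, body in bodies.items():
#     s3_resource.Object(S3_BUCKET_NAME, filename).put(Body=body)
```

Separating "render to CSV" from "upload" also makes the first half easy to test locally without touching S3.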
