How to upload an image file from an s3 bucket to cloudinary (nodejs)


I have image files saved in an S3 bucket and I want to update them using Cloudinary. What's the best way to get the images out of S3 and into Cloudinary?

I can get a readable stream using the AWS SDK for JavaScript (v3) in Node.js:

// Create service client module using ES6 syntax.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
// Set the AWS Region.
const REGION = "eu-west-2";
// Create an Amazon S3 service client object.
const s3Client = new S3Client({ region: REGION });

// Set the parameters.
export const bucketParams = {
  Bucket: "mybucketname",
};

// Get an object from S3 bucket
export async function getS3Object(inputParams: { Key: string }) {
  try {
    const data = await s3Client.send(
      new GetObjectCommand({
        ...bucketParams,
        Key: `public/dalle/${inputParams.Key}`,
      })
    );

    return data; // data.Body is a readable stream
  } catch (err) {
    console.log("Error", err);
  }
}
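
For reference, here is a minimal sketch of consuming that stream, assuming data.Body is a Node.js Readable (which it is when the v3 SDK runs in Node); the object key in the usage comment is hypothetical:

import { Readable } from "stream";

// Collect a readable stream into a single Buffer.
export async function streamToBuffer(stream: Readable): Promise<Buffer> {
  const chunks: Buffer[] = [];
  for await (const chunk of stream) {
    chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
  }
  return Buffer.concat(chunks);
}

// Usage (inside an async function):
// const data = await getS3Object({ Key: "my-image.png" });
// if (data?.Body) {
//   const buffer = await streamToBuffer(data.Body as Readable);
// }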

Uploading to Cloudinary can be done by passing an image URL:

import { v2 } from "cloudinary";
const cloudinary = v2;

// Return "https" URLs by setting secure: true
cloudinary.config({
  secure: true,
  cloud_name: myCloudName,
  api_key: myApiKey,
  api_secret: myApiSecret,
});

export async function uploadImage(fileLocation: string) {
  // upload() resolves with Cloudinary's full upload response
  // (public_id, secure_url, etc.), not just a URL.
  const uploadResult = await cloudinary.uploader.upload(fileLocation, {});
  console.log({
    uploadResult,
  });
  return uploadResult;
}
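
As a quick usage sketch (the source URL below is just a placeholder), the secure_url field on the returned upload response is the new Cloudinary-hosted URL:

// Inside an async function:
const result = await uploadImage("https://example.com/sample.jpg"); // hypothetical URL
console.log(result.secure_url); // Cloudinary-hosted URL of the uploaded image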

Is there a way for Cloudinary to accept a readable stream for upload? Or alternatively, is there a way to get a public image URL from S3? (Or is there a better way to do this entirely?)

CodePudding user response:

Rather than downloading the image from S3, we can create a temporary (pre-signed) URL and pass that to Cloudinary:

// Create service client module using ES6 syntax.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Set the AWS Region.
const REGION = "eu-west-2";
// Create an Amazon S3 service client object.
const s3Client = new S3Client({ region: REGION });

// Set the parameters.
export const bucketParams = {
  Bucket: myBucketName,
};

export async function getTempSignedUrl(inputParams: { Key: string }) {
  try {
    const command = new GetObjectCommand({
      ...bucketParams,
      Key: inputParams.Key,
    });
    // The signed URL expires after 900 seconds by default; pass { expiresIn: <seconds> } to change it.
    const data = await getSignedUrl(s3Client, command, {});
    console.log({
      data,
    });
    return data; // Can be passed to cloudinary as per the upload function in the question
  } catch (err) {
    console.log("Error", err);
  }
}
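
Putting the two pieces together, a minimal sketch (the object key here is hypothetical):

// Inside an async function:
const signedUrl = await getTempSignedUrl({ Key: "public/dalle/my-image.png" });
if (signedUrl) {
  await uploadImage(signedUrl); // uploadImage from the question
}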

Source: https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/s3-example-creating-buckets.html#s3-create-presigendurl

CodePudding user response:

Please make sure the S3 URLs are either publicly accessible, or that read permissions have been granted to Cloudinary's AWS user. For more information: http://support.cloudinary.com/hc/en-us/articles/203276521-How-do-I-allow-Cloudinary-to-read-from-my-private-S3-bucket-
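
If the objects are publicly readable, the plain S3 object URL can be passed straight to the upload function; a sketch using the bucket and region from the question and a hypothetical key:

// Inside an async function:
const bucket = "mybucketname";
const region = "eu-west-2";
const key = "public/dalle/my-image.png"; // hypothetical key
// Virtual-hosted-style S3 URL; only resolves if the object allows public read.
const publicUrl = `https://${bucket}.s3.${region}.amazonaws.com/${key}`;
await uploadImage(publicUrl);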
