How to insert the data into snowflake using s3 stage integration

Time:09-29


Create a storage integration between AWS and Snowflake.

Create an external stage between S3 and Snowflake that uses the integration.
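The two setup steps above can be sketched roughly as follows. All object names, the bucket path, and the IAM role ARN are placeholders; the role must already exist in your AWS account:

```sql
-- 1. Storage integration: lets Snowflake assume an AWS IAM role (one-time setup).
--    The ARN and bucket path below are placeholders.
CREATE STORAGE INTEGRATION my_s3_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/my-prefix/');

-- 2. External stage that uses the integration; COPY INTO reads from this stage.
CREATE STAGE my_db.my_schema.my_stage
  URL = 's3://my-bucket/my-prefix/'
  STORAGE_INTEGRATION = my_s3_integration
  FILE_FORMAT = (TYPE = PARQUET);
```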

CodePudding user response:

USE DATABASE database_name;
USE SCHEMA schema_name;

-- Start from an empty staging table, then load the Parquet files from the
-- external stage. {0}/{1} are path placeholders filled in by the caller.
TRUNCATE TABLE table_name_tmp;
COPY INTO table_name_tmp FROM
(
  SELECT $1:emp_id::INTEGER, $1:col2::DATE, $1:col3::STRING
  FROM @Stagename.schema_name.{0}/{1}
)
ON_ERROR = 'continue'
FILE_FORMAT = (TYPE = PARQUET, NULL_IF = ('NULL'), TRIM_SPACE = TRUE);


-- Upsert from the staging table into the target, matching on emp_id
-- (the join key itself does not need to be re-assigned in the UPDATE),
-- then clear the staging table for the next load.
MERGE INTO table_name dst USING table_name_tmp srs
    ON (dst.emp_id = srs.emp_id)
    WHEN MATCHED THEN
        UPDATE SET dst.col2 = srs.col2, dst.col3 = srs.col3
    WHEN NOT MATCHED THEN
        INSERT (emp_id, col2, col3)
        VALUES (srs.emp_id, srs.col2, srs.col3);
TRUNCATE TABLE table_name_tmp;
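To confirm what a COPY like the one above actually loaded, Snowflake's COPY_HISTORY table function can be queried; the table name here is the same placeholder as in the script:

```sql
-- Per-file load results for the last hour (placeholder table name).
SELECT file_name, status, row_count, first_error_message
FROM TABLE(information_schema.copy_history(
       table_name => 'TABLE_NAME_TMP',
       start_time => DATEADD(hour, -1, CURRENT_TIMESTAMP())));
```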

CodePudding user response:

I think the question could be worded better. An S3 storage integration is a METHOD for connecting Snowflake to an external stage; you don't extract data from an integration. You still use the external stage to COPY data into Snowflake. The alternative method is to authenticate with secret access keys, although Snowflake recommends the storage integration: it is a one-time setup, and it means you don't have to manage those keys yourself.
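For comparison, the key-based alternative mentioned above looks roughly like this; the key values are placeholders, and embedding them in DDL is exactly what the storage-integration approach avoids:

```sql
-- Key-based stage: AWS credentials are supplied directly (placeholder values).
CREATE STAGE my_db.my_schema.my_keyed_stage
  URL = 's3://my-bucket/my-prefix/'
  CREDENTIALS = (AWS_KEY_ID = 'my_key_id' AWS_SECRET_KEY = 'my_secret_key')
  FILE_FORMAT = (TYPE = PARQUET);
```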

S3, by the way, is AWS's blob store. A step-by-step configuration guide is in the Snowflake docs: https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration.html
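One step from that guide worth calling out: after creating the integration, Snowflake generates an IAM user ARN and an external ID that must be added to the trust policy of the AWS role. You can retrieve both with DESC INTEGRATION (the integration name here is a placeholder):

```sql
-- Shows STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID, which go
-- into the trust relationship of the AWS IAM role.
DESC INTEGRATION my_s3_integration;
```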
