I am using Spring Boot and Java to copy data from tables in Snowflake to an S3 bucket.
I am using this code:
"COPY INTO s3://snowflake/" userId " from \"TEST\".\"PUBLIC\".\"USER_TABLE_TEMP\" storage_integration = s3_int file_format = CSV_TEST;";
And it works. I am putting userId as a prefix to the file name.
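For context, here is a minimal sketch of how such a statement could be executed from Spring Boot (assuming a JdbcTemplate bean backed by the Snowflake JDBC driver; the service and method names are illustrative):

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;

@Service
public class SnowflakeUnloadService {

    private final JdbcTemplate jdbcTemplate;

    public SnowflakeUnloadService(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Unloads the user's rows to s3://snowflake/<userId> as CSV,
    // using the storage integration and file format from the question.
    public void unloadUserTable(String userId) {
        String sql = "COPY INTO s3://snowflake/" + userId
                + " from \"TEST\".\"PUBLIC\".\"USER_TABLE_TEMP\""
                + " storage_integration = s3_int"
                + " file_format = CSV_TEST;";
        jdbcTemplate.execute(sql);
    }
}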
What doesn't work is copying data again for the same user, i.e. when a .csv file for that user already exists in the bucket.
When I try to do it, I am getting this error:
Files already existing at the unload destination: s3://snowflake/1. Use overwrite option to force unloading.
How can I make this work so that the new file overwrites the old one?
CodePudding user response:
You have to add the OVERWRITE copy option and set it to TRUE in the COPY command.
Docs: https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html#copy-options-copyoptions
So the statement is:
COPY INTO ...
file_format = CSV_TEST
OVERWRITE=TRUE;
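Applied to the Java code from the question, that means appending the copy option to the generated SQL, for example (same illustrative jdbcTemplate as in the sketch above):

String sql = "COPY INTO s3://snowflake/" + userId
        + " from \"TEST\".\"PUBLIC\".\"USER_TABLE_TEMP\""
        + " storage_integration = s3_int"
        + " file_format = CSV_TEST"
        + " OVERWRITE = TRUE;"; // replaces any existing file under this userId prefix
jdbcTemplate.execute(sql);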