I have a question here. When I load from an S3 bucket, the COPY command also picks up old files that are still sitting in the bucket. Is there any option in the COPY command (e.g. `copy into table from @table_json`) to avoid this? Is there any option in Snowflake to archive the old data in the S3 bucket?
Please help if there are any options.
CodePudding user response:
No, there is no option to move files to another path after they have been loaded using a COPY statement.
Other options to consider:
- Load files directly into the S3 archive location and set up a Lambda function to copy new files to a dedicated "load" location. When running the COPY command in Snowflake, specify the PURGE option to delete files from the "load" location after they have been loaded (see the sketch after this list).
- Use Snowpipe to automatically load new files.
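A rough sketch of the PURGE approach, issued through the Snowflake Python connector. The table name `raw_json`, the stage `@table_json`, and the connection details are all placeholders, not from the original question:

```python
# Minimal sketch: run COPY INTO with PURGE = TRUE via the Snowflake
# Python connector. Table/stage names and connection details are
# hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="MY_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # PURGE = TRUE deletes files from the stage location (the S3 "load"
    # prefix) after they have been loaded successfully.
    cur.execute("""
        COPY INTO raw_json
        FROM @table_json
        FILE_FORMAT = (TYPE = JSON)
        PURGE = TRUE
    """)
    for row in cur.fetchall():
        print(row)  # per-file load status returned by COPY
finally:
    conn.close()
```

Note that PURGE deletes rather than moves files, so the "archive" in this pattern comes from landing files in the archive location first and only copying them into the load prefix for ingestion.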
CodePudding user response:
If I understand correctly, you want to archive the files to a different folder after the COPY statement is executed.
I can think of this option:
- Set up server access logging for the S3 bucket used by the COPY command.
- Then set up a Lambda trigger on the access-logging bucket's object-creation events (see the sketch after this list).
- It will not be real time, since access logs are delivered with a delay, but for archiving that is acceptable.
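A rough sketch of what that Lambda handler could look like, assuming Python with boto3. The bucket name, archive prefix, and the simplified access-log parsing are all illustrative assumptions:

```python
# Hypothetical Lambda sketch: triggered by object creation in the S3
# access-logging bucket, it scans each log for GET requests against the
# data bucket and moves those files under an "archive/" prefix.
import urllib.parse
import boto3

s3 = boto3.client("s3")

DATA_BUCKET = "my-snowflake-load-bucket"   # hypothetical
ARCHIVE_PREFIX = "archive/"                # hypothetical

def handler(event, context):
    for record in event["Records"]:
        log_bucket = record["s3"]["bucket"]["name"]
        log_key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        body = s3.get_object(Bucket=log_bucket, Key=log_key)["Body"].read()
        for line in body.decode("utf-8").splitlines():
            parts = line.split(" ")
            # Naive parse of the S3 server access log format:
            # field 7 is the operation, field 8 is the (URL-encoded) key.
            if len(parts) > 8 and parts[7] == "REST.GET.OBJECT":
                key = urllib.parse.unquote(parts[8])
                if key.startswith(ARCHIVE_PREFIX):
                    continue  # already archived
                # Copy the accessed file to the archive prefix, then
                # delete the original so it is not loaded again.
                s3.copy_object(
                    Bucket=DATA_BUCKET,
                    CopySource={"Bucket": DATA_BUCKET, "Key": key},
                    Key=ARCHIVE_PREFIX + key,
                )
                s3.delete_object(Bucket=DATA_BUCKET, Key=key)
```

Copying and then deleting moves each accessed file out of the load path, so a later COPY does not pick it up again.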
Hope this helps. It's just an idea from my side; please let me know.
See also: Is there a way to invoke a Lambda function if an Amazon S3 object is accessed?