Is there any S3 job or functionality that automatically moves all files that have been there for more than 10 days to another folder (using "files" and "folders" for simplicity instead of objects)? Or files that have not been modified for more than 10 days? My purpose is to make a request using an SDK and retrieve only the files created in the last 10 days, without deleting the others, just moving them to a different folder.
CodePudding user response:
You can use Amazon S3 Lifecycle rules.
Please note that Lifecycle rules act on objects in place; if you actually need to move objects, you would usually move them to a different bucket.
Managing object lifecycle
Define S3 Lifecycle configuration rules for objects that have a well-defined lifecycle. For example:
If you upload periodic logs to a bucket, your application might need them for a week or a month. After that, you might want to delete them.
Some documents are frequently accessed for a limited period of time. After that, they are infrequently accessed. At some point, you might not need real-time access to them, but your organization or regulations might require you to archive them for a specific period. After that, you can delete them.
You might upload some types of data to Amazon S3 primarily for archival purposes. For example, you might archive digital media, financial and healthcare records, raw genomics sequence data, long-term database backups, and data that must be retained for regulatory compliance.
With S3 Lifecycle configuration rules, you can tell Amazon S3 to transition objects to less-expensive storage classes, or archive or delete them.
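For example, a rule like this can be applied with the AWS SDK. Below is a minimal boto3 sketch, assuming a placeholder bucket name and prefix; note that Lifecycle rules change the storage class or expire objects in place, they do not move objects to a different folder or prefix:

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket and prefix; adjust to your setup.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-after-10-days",
                "Status": "Enabled",
                "Filter": {"Prefix": "incoming/"},
                # Transition objects to Glacier 10 days after creation.
                # (Transitions to STANDARD_IA/ONEZONE_IA require at least 30 days.)
                "Transitions": [{"Days": 10, "StorageClass": "GLACIER"}],
            }
        ]
    },
)
```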
CodePudding user response:
I would suggest you consider using AWS Step Functions for this. You can implement the following workflow:
- Use the S3 event to trigger the Step Functions workflow. Information on that is available here.
- Use the Wait state within Step Functions to pause for 10 days (the maximum is one year); a sketch of such a state machine follows this list. Information on that is available here.
- After the wait, trigger a Lambda function that moves the object to a new folder in S3; a sketch of that function appears at the end of this answer.
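Here is a minimal sketch of what that state machine might look like, created with boto3. The state machine name, Lambda ARN, account ID, and role ARN below are placeholders, and the S3 event notification is assumed to be passed through as the execution input:

```python
import json

import boto3

sfn = boto3.client("stepfunctions")

# Amazon States Language definition: wait 10 days, then invoke a
# (hypothetical) Lambda that moves the object, passing the S3 event along.
definition = {
    "StartAt": "WaitTenDays",
    "States": {
        "WaitTenDays": {
            "Type": "Wait",
            "Seconds": 10 * 24 * 60 * 60,  # 10 days
            "Next": "MoveObject",
        },
        "MoveObject": {
            "Type": "Task",
            # Placeholder Lambda ARN; replace with your function's ARN.
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:move-s3-object",
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name="move-old-s3-objects",  # assumed name
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsExecutionRole",  # placeholder
)
```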
I would suggest that you move the object to a different S3 bucket rather than a different folder. This is because you want to avoid a loop in which moving the object triggers another Step Functions workflow. You could limit the event rule to an object prefix, but it is safer not to have to worry about that.
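And here is a minimal sketch of the Lambda handler itself, assuming the execution input is the original S3 event notification and that the destination bucket name is a placeholder (copying to a separate bucket, per the note above):

```python
import urllib.parse

import boto3

s3 = boto3.client("s3")

# Placeholder destination bucket; replace with your own.
DEST_BUCKET = "my-archive-bucket"


def handler(event, context):
    # The execution input is assumed to be the S3 event notification
    # that started the workflow (bucket + key of the new object).
    record = event["Records"][0]
    src_bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

    # Copy the object to the destination bucket, then delete the original.
    s3.copy_object(
        Bucket=DEST_BUCKET,
        Key=key,
        CopySource={"Bucket": src_bucket, "Key": key},
    )
    s3.delete_object(Bucket=src_bucket, Key=key)

    return {"moved": f"s3://{DEST_BUCKET}/{key}"}
```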