Our production database tables have grown extremely large, so the requirement is to keep only 2 years of data in the production database and move all older data to an archival database. The archival database will therefore hold the bulk of the data. For example, with 10 years of data, 8 years would live in the archival DB and 2 years in the production DB. The problems are:
- Most production tables have no index on the date column, so data retrieval is very slow.
- What is the best way to fetch and delete such a huge volume of data (e.g. 500 GB of transactions spanning 10 years in a single table), where the delete is driven by a date column?
- How can BULK COLLECT and FORALL be used to optimise the deletion?
Regards, Yogesh
CodePudding user response:
You can try something like this:
create table new_table nologging          -- NOLOGGING (formerly UNRECOVERABLE) skips redo generation
  as select * from old_table where ....;  -- keep only the rows you want to retain
drop table old_table;
rename new_table to old_table;
create index old_table_idx1 on old_table(c1, c2) nologging parallel 5;
.....
No redo logging on that: just move the rows you want to keep into a new table, drop the old table, rename the new one into its place, and recreate the indexes as fast as possible without logging. (Remember to recreate any grants, constraints, and triggers on the new table, since they are lost with the drop.)
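The question also asked about BULK COLLECT and FORALL. If the table must stay in place and an in-place delete is unavoidable, a batched PL/SQL delete keeps the undo and redo per transaction bounded. A rough sketch, assuming a transactions table with a txn_date column (all names here are placeholders):

declare
  cursor c_old is
    select rowid
      from transactions
     where txn_date < add_months(trunc(sysdate), -24);  -- rows older than 2 years

  type t_rowids is table of rowid index by pls_integer;
  l_rowids t_rowids;
begin
  open c_old;
  loop
    -- fetch a batch of rowids; tune the limit to your undo capacity
    fetch c_old bulk collect into l_rowids limit 10000;
    exit when l_rowids.count = 0;

    -- FORALL sends the whole batch of deletes in one round trip
    forall i in 1 .. l_rowids.count
      delete from transactions
       where rowid = l_rowids(i);

    commit;  -- commit per batch so undo is released as you go
  end loop;
  close c_old;
end;
/

One caveat: committing inside the fetch loop risks ORA-01555 (snapshot too old) on very long runs. A common alternative is a plain loop of DELETE ... WHERE txn_date < :cutoff AND ROWNUM <= 10000, committing each pass and stopping when SQL%ROWCOUNT is 0. Either way, the CTAS approach above is usually far faster for deleting 80% of a table.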
In addition, once you have removed the unwanted data, consider PARTITIONING the table on the date column. Then you can simply drop partitions once their retention period is met.
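As a sketch of that layout (table and column names are illustrative), monthly interval partitioning on the date column looks like:

create table transactions (
  txn_id   number,
  txn_date date,
  amount   number
)
partition by range (txn_date)
interval (numtoyminterval(1, 'MONTH'))   -- Oracle auto-creates one partition per month
(
  partition p_initial values less than (date '2015-01-01')
);

-- Dropping an expired month is a fast metadata operation, not a row-by-row delete:
alter table transactions
  drop partition for (date '2015-06-15')
  update global indexes;

UPDATE GLOBAL INDEXES keeps any global indexes usable during the drop; local indexes are dropped along with the partition automatically.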
CodePudding user response:
I'd suggest partitioning both the production table and the archive table by day/week/month and periodically moving partitions from the production table to the archive table. That is a near-instant metadata operation; no rows are copied. Here is a blog describing your use case.
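The partition move between the two tables is usually done with a partition exchange, which swaps data dictionary pointers rather than copying rows. A rough sketch, assuming matching partitioned tables and a staging table with the same column structure (all names are placeholders):

-- 1. Swap the oldest production partition with an empty staging table:
--    the partition's segment becomes the staging table's segment.
alter table prod_transactions
  exchange partition p_2015_06 with table staging_transactions
  without validation;

-- 2. Swap the staged segment into the archive table's matching partition.
alter table arch_transactions
  exchange partition p_2015_06 with table staging_transactions
  without validation;

-- 3. The now-empty production partition can be dropped.
alter table prod_transactions drop partition p_2015_06;

Because only metadata is exchanged, this takes roughly the same time whether the partition holds a thousand rows or a billion. WITHOUT VALIDATION skips the row-by-row partition-key check, which is safe here since the data already belonged to that partition.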