Oracle: reading a large table with FETCH ... BULK COLLECT is slow

Time:09-25

Hi all, I have a table with millions of rows. I need to read every row and process its data, but fetching with FETCH ... BULK COLLECT is slow. Is there any way to speed this up?

CodePudding user response:

Read the data in batches by ROWID, e.g. process 10,000 rows at a time.

CodePudding user response:

Quoting reply #1 from qq646748739:
"Read the data in batches by ROWID, e.g. process 10,000 rows at a time."

May I ask how to use ROWID for this? Could you post the SQL for such a stored procedure?

CodePudding user response:

Here is a batch-delete example:

  DECLARE
    CURSOR mycursor IS
      SELECT rowid FROM table_name WHERE <your conditions>;
    TYPE rowid_table_type IS TABLE OF ROWID INDEX BY PLS_INTEGER;
    v_rowid rowid_table_type;
  BEGIN
    OPEN mycursor;
    LOOP
      FETCH mycursor BULK COLLECT INTO v_rowid LIMIT 10000;  -- fetch 10,000 rows at a time, i.e. one commit per 10,000 rows
      EXIT WHEN v_rowid.COUNT = 0;
      FORALL i IN v_rowid.FIRST .. v_rowid.LAST
        DELETE FROM table_name WHERE rowid = v_rowid(i);
      COMMIT;
    END LOOP;
    CLOSE mycursor;
  END;


You can work out the rest on your own.
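For the original poster's read-process-write case, the same batch pattern can be adapted. A rough sketch, assuming a source table a and a target table b with the same column layout (all names here are placeholders, not from the thread):

  DECLARE
    CURSOR c_src IS SELECT * FROM a;
    TYPE src_tab IS TABLE OF a%ROWTYPE INDEX BY PLS_INTEGER;
    v_rows src_tab;
  BEGIN
    OPEN c_src;
    LOOP
      FETCH c_src BULK COLLECT INTO v_rows LIMIT 10000;  -- one batch of at most 10,000 rows per fetch
      EXIT WHEN v_rows.COUNT = 0;
      FOR i IN 1 .. v_rows.COUNT LOOP
        NULL;  -- transform v_rows(i) here before writing it out
      END LOOP;
      FORALL i IN 1 .. v_rows.COUNT
        INSERT INTO b VALUES v_rows(i);  -- bulk-insert the processed batch
      COMMIT;  -- caution: committing while the cursor is still open risks ORA-01555 on a busy table
    END LOOP;
    CLOSE c_src;
  END;

The FORALL record insert requires b to match a's column structure; otherwise list the target columns explicitly.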

CodePudding user response:

Quoting the original poster wangyou_1987:
"I have a table with millions of rows. I need to read every row and process its data, but fetching with FETCH ... BULK COLLECT is slow. Is there any way to speed this up?"

You should state your requirements and scenario more clearly. For millions of rows, a cursor is not a good solution.

CodePudding user response:

Quoting the ROWID batch-delete example from reply #3 by qq646748739.

I followed exactly this method, but each successive FETCH takes longer and longer.

CodePudding user response:

Quoting reply #4 from jdsnhan (in response to the original poster's question):
"You should state your requirements and scenario more clearly. For millions of rows, a cursor is not a good solution."

The ERP writes all POS data dated before the current date into table a over a dblink. I then read each record, extract the relevant fields, process them, and write the result into table b; at the same time I insert the record into table c and delete it from table a. Without a cursor, what method could handle this quickly?
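If the per-row processing can be expressed in SQL, the whole job can be done without any cursor, as three set-based statements. A sketch under that assumption (a, b, c are the tables from the post; the column list and transformation are hypothetical):

  -- 1. Write the processed fields into b in a single pass
  INSERT INTO b (col1, col2)
    SELECT TRIM(a.col1), a.col2  -- replace with the real transformation
    FROM a;

  -- 2. Archive the same records into c
  INSERT INTO c
    SELECT * FROM a;

  -- 3. Remove them from a, then commit once
  DELETE FROM a;
  COMMIT;

If new rows can arrive in a while this runs, add the same WHERE condition (e.g. on the date column) to all three statements so they cover exactly the same set of rows.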