I built an Oracle tablespace and tables, and wrote a batch import tool in C#.

At first I used plain SQL: I read a table's data and concatenated it into one big INSERT statement to execute. But I ran into CLOB columns that store images as very long strings, so a single generated SQL statement could reach hundreds of KB, which exceeds Oracle's statement-size limit and fails to execute. That approach was a dead end.

I then switched to binding with OracleParameter. Since I need bulk imports and the database is Oracle 10g, I can only use ODP.NET. That raised a new problem: a table with hundreds of thousands of rows imports in just a few seconds, but only when it is plain data. For several tables that have a primary key built on a few columns, the import is very slow: one batch of 500 rows takes close to 3 minutes, so importing all the tables takes more than an hour. And if I raise the batch size to 1000, the table gets locked.

Does anyone know how to solve this?
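For reference, here is a minimal sketch of the ODP.NET array-binding approach the question describes (one INSERT executed once for a whole batch via `ArrayBindCount`). The table name, columns, and connection string are placeholders, not from the original post:

```csharp
// Sketch: batch insert with ODP.NET array binding, assuming a
// hypothetical table MY_TABLE(ID NUMBER, DOC CLOB).
using System;
using Oracle.DataAccess.Client; // ODP.NET provider (works with 10g)

class BulkInsertSketch
{
    static void Main()
    {
        const int batchSize = 500;
        var ids = new int[batchSize];
        var docs = new string[batchSize];
        for (int i = 0; i < batchSize; i++)
        {
            ids[i] = i;
            docs[i] = "...long CLOB content such as an encoded image...";
        }

        using (var conn = new OracleConnection(
            "User Id=scott;Password=tiger;Data Source=orcl")) // placeholder
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            cmd.CommandText = "INSERT INTO MY_TABLE (ID, DOC) VALUES (:id, :doc)";
            cmd.ArrayBindCount = batchSize; // whole batch in one round trip

            // Each parameter's Value is an array of batchSize elements.
            cmd.Parameters.Add(new OracleParameter("id", OracleDbType.Int32) { Value = ids });
            cmd.Parameters.Add(new OracleParameter("doc", OracleDbType.Clob) { Value = docs });

            using (var tx = conn.BeginTransaction())
            {
                cmd.ExecuteNonQuery();
                tx.Commit(); // commit once per batch, not per row
            }
        }
    }
}
```

Committing per batch rather than per row, and keeping the batch size moderate, is the usual starting point before investigating index or lock overhead.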
CodePudding user response:
I don't know the development side, so I'll skip that and comment only on the database part:

1. A primary key is backed by a unique index, which is required to guarantee uniqueness of the data. Inserting into an indexed table is naturally slower than inserting into one without indexes, and the more indexes there are, the more pronounced the slowdown.

2. I don't understand what you mean by "a batch of 1000 locks the table." Are you inserting from multiple threads? DML operations, including INSERT, do take a table lock, but it is a mode-3 (row-exclusive) TM queue lock, which is not a high level. Mode-3 locks are compatible with each other, so a table can allow multiple sessions to operate on its data at the same time; as long as multiple sessions aren't modifying the same row, you won't get lock waits. I'd investigate what your session is actually waiting on after inserting 1000 rows: get its SID, query the EVENT column of v$session, and drill down from there.
CodePudding user response:
1. Check whether the table has triggers; if so, review their logic.
2. Check the table's foreign key constraints, mainly the foreign keys referencing other tables.
3. Drop the constraints/indexes and re-create them after the import (be sure to back up the existing definitions and data first).
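Point 3 above can be done without fully dropping the primary key: disabling the constraint before the load removes its index, and re-enabling it afterwards rebuilds the index and re-validates uniqueness in one pass. Table and constraint names here are hypothetical:

```sql
-- Script the existing constraint definition before touching it.
ALTER TABLE my_table DISABLE CONSTRAINT my_table_pk;

-- ... run the bulk import here ...

-- Rebuilds the unique index and validates all rows; fails if duplicates
-- were loaded, so clean the source data first.
ALTER TABLE my_table ENABLE CONSTRAINT my_table_pk;
```

Rebuilding one index over the finished data set is generally much cheaper than maintaining it row by row during hundreds of thousands of inserts.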