Can a DataWindow retrieve millions of rows?
This is for a customer system upgrade. The original data is in Oracle, and the new system uses a SQL Server database with different business tables. I need to retrieve the Oracle data into a DataWindow, process it row by row, and then write it to the SQL Server database.
Because of the processing involved, one Oracle table will probably have to be split across roughly three target tables.
Is there a better way?
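Independent of PB, the overall flow described above (read from one source table, transform each row, fan the results out into three target tables) can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 as a stand-in for both databases; the table names (src_orders, tgt_a/tgt_b/tgt_c) and the routing rule are hypothetical, not from the original post.

```python
import sqlite3

# In-memory stand-in for the Oracle source and the SQL Server target.
# In the real migration these would be two separate connections.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source table and three hypothetical target tables.
cur.execute("CREATE TABLE src_orders (id INTEGER, kind TEXT, amount REAL)")
cur.execute("CREATE TABLE tgt_a (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_b (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_c (id INTEGER, amount REAL)")

cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
                [(1, "a", 10.0), (2, "b", 20.0), (3, "c", 30.0), (4, "a", 40.0)])

# Row-by-row processing: route each source row to one of three target tables.
target_for = {"a": "tgt_a", "b": "tgt_b", "c": "tgt_c"}
rows = cur.execute("SELECT id, kind, amount FROM src_orders").fetchall()
for row_id, kind, amount in rows:
    conn.execute(f"INSERT INTO {target_for[kind]} VALUES (?, ?)", (row_id, amount))
conn.commit()

counts = {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
          for t in ("tgt_a", "tgt_b", "tgt_c")}
print(counts)  # {'tgt_a': 2, 'tgt_b': 1, 'tgt_c': 1}
```

With millions of rows, the loop would read the source in batches rather than all at once, as the replies below suggest.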
CodePudding user response:
Can a DataWindow retrieve millions of rows?
It can handle that much data without problems.
For this kind of task, though, you should use a Data Pipeline: define the extraction conditions and let the pipeline pump the data across.
CodePudding user response:
In reply to the original poster ducklyxh: "Can a DataWindow retrieve millions of rows? This is for a customer system upgrade: the original data is in Oracle, the new system uses SQL Server with different business tables, so I need to retrieve the Oracle data into a DataWindow, process it row by row, and write it to SQL Server. Because of the processing, one Oracle table will probably be split across roughly three tables. Is there a better way?"
It can handle millions of rows, but batching is recommended, e.g. a few thousand rows at a time. Consider using Oracle's ROWNUM to implement the batching.
CodePudding user response:
ROWNUM is a pseudocolumn that Oracle assigns while fetching: it starts at 1 and is only incremented after a row satisfies the WHERE clause. With a condition like ROWNUM >= 6, the first candidate row is assigned ROWNUM = 1, fails the test, and is discarded; the next row is then again assigned ROWNUM = 1, fails again, and so on. Since no row can ever satisfy ROWNUM >= 6, the query returns nothing.
The workaround is to materialize ROWNUM in a subquery first:

Select * from (
    Select a1.*, rownum rwn from emp a1 where rownum <= 10
) where rwn >= 6;

or

Select * from (
    Select qx.*, row_number() over (order by qx.empno) rwn from emp qx
) where rwn between 6 and 10;

CodePudding user response:
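The batching suggested earlier can be driven by generating one such wrapped ROWNUM query per window. A small helper sketch in pure Python; the table name and the window sizes are hypothetical illustration values:

```python
def rownum_batch_queries(table, total_rows, batch_size):
    """Yield one Oracle ROWNUM-windowed query per batch of rows.

    ROWNUM must be materialized in an inner query (aliased as rwn)
    before a lower bound can be applied in the outer query.
    """
    start = 1
    while start <= total_rows:
        end = min(start + batch_size - 1, total_rows)
        yield ("SELECT * FROM ("
               f"SELECT a1.*, ROWNUM rwn FROM {table} a1 WHERE ROWNUM <= {end}"
               f") WHERE rwn >= {start}")
        start = end + 1

queries = list(rownum_batch_queries("emp", 10, 4))
print(len(queries))   # 3 batches: rows 1-4, 5-8, 9-10
print(queries[-1])
```

Each generated statement fetches one window; the migration loop would execute them in order and process each batch before moving on.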
Use the PB Data Pipeline for this.
CodePudding user response:
Process it in batches, or use a Data Pipeline. Don't retrieve everything and process it in one pass: the efficiency is poor, and any error forces you to start all over again. Write the processed rows to a temporary table first; once everything checks out, insert from there into the target tables directly.
CodePudding user response:
SQL Server has data import and export tools; there are many ways to do this.
CodePudding user response:
In reply to wag_enu (6th floor): "SQL Server has data import and export tools; there are many ways to do this."
Import the data into SQL Server first, then do the processing there.
CodePudding user response:
The data volume is too large, so it must be processed in batches to improve both speed and accuracy; using a temporary table as a backup is safer.
CodePudding user response:
Suggestion: use the import/export tools, or link the two databases directly.