Efficiency of batch-updating a moderate amount of data

Time: 10-07

I have about a thousand rows of data stored in a List. What is the most efficient way to batch-Update them to the database?

Looping over the List and executing one Update per row is certainly not fast,
and concatenating everything into one large SQL statement doesn't seem like a good idea either.

CodePudding user response:

First off, holding a thousand rows in a List is not the right approach to begin with; if you use it that way, the only way to save the updates to the database is to loop.

CodePudding user response:

At a thousand rows there shouldn't be much room left to improve efficiency; it shouldn't make much difference how you save them.

CodePudding user response:

Quote from the original poster lovingkiss:
...
Looping over the List and executing one Update per row is certainly not fast,
and concatenating everything into one large SQL statement doesn't seem like a good idea either.

Executing one statement at a time pays a round-trip delay per row;
concatenating into one long SQL statement runs into statement-length limits.

Batch updates give better efficiency.
If you use MSSQL, you can use SqlBulkCopy;
if you use Oracle, you can use OracleBulkCopy.

Suppose you have the following Users table in MSSQL:
CREATE TABLE [dbo].[Users] (
    [Id]   INT        NOT NULL,
    [Name] NCHAR (32) NULL
);


You can use SqlBulkCopy:

using System.Data;
using System.Data.SqlClient;

var connectionStr = "Data Source=...; Initial Catalog=...; Integrated Security=True";

// Build 1500 rows of demo data
var batchData = new DataTable("Users");
batchData.Columns.Add("Id", typeof(int));
batchData.Columns.Add("Name", typeof(string));
for (int i = 0; i < 1500; i++)
{
    batchData.Rows.Add(i, "name" + i);
}

using (var con = new SqlConnection(connectionStr))
{
    con.Open();
    var sbc = new SqlBulkCopy(con)
    {
        DestinationTableName = "dbo.Users",
        BatchSize = 1000,
    };
    sbc.WriteToServer(batchData);
}


CodePudding user response:

Seeing this title, I thought an old comrade was showing off something new.

CodePudding user response:

While you're agonizing over questions like these, go look at how PHP, Go, Python, and Java people handle this; they don't fuss over it at all.

So there's no need to get so tangled up. Just do what needs doing; the more you agonize over this, the more futile it is.

CodePudding user response:

Convert the List to an Array and work with the array.

CodePudding user response:

With SQL Server you can use a table variable and do the update through a stored procedure.
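
A minimal sketch of that idea using a table-valued parameter, reusing connectionStr and batchData from the SqlBulkCopy example above; the type name dbo.UserTableType and the procedure dbo.UpdateUsers are illustrative assumptions, not something from this thread:

-- One-time setup (assumed, illustrative names):
-- CREATE TYPE dbo.UserTableType AS TABLE (Id INT NOT NULL, Name NCHAR(32) NULL);
-- CREATE PROCEDURE dbo.UpdateUsers @rows dbo.UserTableType READONLY AS
--     UPDATE u SET u.Name = r.Name
--     FROM dbo.Users u JOIN @rows r ON u.Id = r.Id;

using (var con = new SqlConnection(connectionStr))
{
    con.Open();
    var cmd = new SqlCommand("dbo.UpdateUsers", con)
    {
        CommandType = CommandType.StoredProcedure
    };
    // Pass the whole DataTable as one structured parameter: one round trip
    var p = cmd.Parameters.AddWithValue("@rows", batchData);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.UserTableType";
    cmd.ExecuteNonQuery();
}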

CodePudding user response:

If you convert it to a DataTable you can use SqlBulkCopy,
but converting the List to a DataTable isn't fast either; still, for a thousand rows it should be acceptable. (A conversion sketch follows.)
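
For reference, a minimal List-to-DataTable conversion sketch, assuming a hypothetical User class with Id and Name properties:

using System.Collections.Generic;
using System.Data;

static DataTable ToDataTable(List<User> users)
{
    var table = new DataTable("Users");
    table.Columns.Add("Id", typeof(int));
    table.Columns.Add("Name", typeof(string));
    foreach (var u in users)
    {
        // One DataRow per list element; cheap for ~1000 rows
        table.Rows.Add(u.Id, u.Name);
    }
    return table;
}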

CodePudding user response:

The plan on the 3rd floor is fine. If it's Oracle, you can write a plain SQL statement and pass the parameters in as arrays (array binding); it's fast.
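
A sketch of Oracle array binding with ODP.NET (Oracle.ManagedDataAccess); the Users table and the ids/names arrays are illustrative assumptions:

using Oracle.ManagedDataAccess.Client;

int[] ids = { 1, 2, 3 };
string[] names = { "a", "b", "c" };

using (var con = new OracleConnection(connectionStr))
{
    con.Open();
    var cmd = con.CreateCommand();
    cmd.CommandText = "UPDATE Users SET Name = :name WHERE Id = :id";
    // Bind whole arrays; all parameter sets go to the server in one round trip
    cmd.ArrayBindCount = ids.Length;
    cmd.Parameters.Add(new OracleParameter("name", OracleDbType.Varchar2) { Value = names });
    cmd.Parameters.Add(new OracleParameter("id", OracleDbType.Int32) { Value = ids });
    cmd.ExecuteNonQuery();
}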

CodePudding user response:

For batch operations, SqlBulkCopy should definitely be the first choice.

But since your data source is a List, you'd need to convert it to a DataTable before you can call WriteToServer.

If a conversion is needed anyway, I think it's simpler to just loop over the List and execute SQL statements directly,

say 10 rows per statement, executed 100 times for 1000 rows (sketched below).

A few thousand rows shouldn't be any problem at all.

MySQL's multi-row INSERT ... VALUES (...), (...), (...) can easily insert ten thousand rows a second.

A few thousand rows are done in an instant; I don't think you need to worry about efficiency.
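
A rough sketch of that chunked approach, assuming an open SqlConnection con and a hypothetical rows list with Id and Name; the chunk size of 10 is just the figure from this reply:

using System;
using System.Data.SqlClient;
using System.Text;

const int chunk = 10;
for (int start = 0; start < rows.Count; start += chunk)
{
    var sql = new StringBuilder();
    var cmd = new SqlCommand { Connection = con };
    int end = Math.Min(start + chunk, rows.Count);
    for (int i = start; i < end; i++)
    {
        int k = i - start;
        // Several parameterized UPDATEs batched into one command text,
        // so each chunk costs only one round trip
        sql.Append($"UPDATE dbo.Users SET Name = @n{k} WHERE Id = @i{k};");
        cmd.Parameters.AddWithValue("@n" + k, rows[i].Name);
        cmd.Parameters.AddWithValue("@i" + k, rows[i].Id);
    }
    cmd.CommandText = sql.ToString();
    cmd.ExecuteNonQuery();
}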

CodePudding user response:

This is a classic problem, and it's actually not quite what the posters above said: row-by-row execution and batched SQL execution differ hugely in efficiency (measured first-hand). A more effective method (SQL Server as the example):
1. Save the data into a DataTable.
2. Open a transaction and create a table variable.
3. Use BulkCopy to insert the DataTable's rows into the table variable.
4. Use one SQL statement to process the staged data as a batch.
5. Commit the transaction.
This method is several times, even dozens of times, faster than row-by-row processing (a sketch follows at the end of this reply).

One of the problems with row-by-row execution is that it talks to the database too many times, and the per-round-trip overhead is the bulk of the cost.
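
A sketch of those five steps, reusing the Users example from above. One caveat: SqlBulkCopy cannot write to a table variable, so this version stages into a connection-scoped temp table instead:

using (var con = new SqlConnection(connectionStr))
{
    con.Open();
    using (var tx = con.BeginTransaction())
    {
        // Step 2: staging table, visible only to this connection
        new SqlCommand("CREATE TABLE #Staging (Id INT NOT NULL, Name NCHAR(32) NULL);",
                       con, tx).ExecuteNonQuery();

        // Step 3: bulk-insert the DataTable into the staging table
        using (var sbc = new SqlBulkCopy(con, SqlBulkCopyOptions.Default, tx))
        {
            sbc.DestinationTableName = "#Staging";
            sbc.WriteToServer(batchData);
        }

        // Step 4: one set-based UPDATE instead of a thousand row-by-row calls
        new SqlCommand(
            "UPDATE u SET u.Name = s.Name FROM dbo.Users u JOIN #Staging s ON u.Id = s.Id;",
            con, tx).ExecuteNonQuery();

        // Step 5: commit
        tx.Commit();
    }
}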

CodePudding user response:

Quote from the 10th-floor reply by by_love:
For batch operations, SqlBulkCopy should definitely be the first choice.

But since your data source is a List, you'd need to convert it to a DataTable before you can call WriteToServer.

If a conversion is needed anyway, I think it's simpler to just loop over the List and execute SQL statements directly,

say 10 rows per statement, executed 100 times for 1000 rows.

A few thousand rows shouldn't be any problem at all.

MySQL's multi-row INSERT ... VALUES (...), (...), (...) can easily insert ten thousand rows a second.

A few thousand rows are done in an instant; I don't think you need to worry about efficiency.


What you describe is Insert; what I'm weighing here is Update.

CodePudding user response:

Quote from the 12th-floor reply by lovingkiss:
...
What you describe is Insert; what I'm weighing here is Update.


The principle is the same:
first bulk-insert into a temporary table,
then MERGE (or call a stored procedure you write yourself).

MERGE (Transact-SQL) reference and examples:
https://docs.microsoft.com/zh-cn/sql/t-sql/statements/merge-transact-sql
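
A minimal MERGE sketch in the spirit of the linked docs, reusing the #Staging and dbo.Users names from the earlier examples (update matched rows, insert the rest):

-- Upsert from the staging table into the target table
MERGE dbo.Users AS target
USING #Staging AS source
    ON target.Id = source.Id
WHEN MATCHED THEN
    UPDATE SET target.Name = source.Name
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Name) VALUES (source.Id, source.Name);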
Tags: C#