Failed insert in db by ExecuteSqlInterpolatedAsync EF Core

Time:09-23

I'm having trouble inserting into a PostgreSQL database with the EF Core methods ExecuteSqlInterpolatedAsync/ExecuteSqlInterpolated. Table schema:

      Column  |  Type   | Collation | Nullable |                 Default                  
----------+---------+-----------+----------+------------------------------------------
 Id       | integer |           | not null | nextval('"LookupRows_Id_seq"'::regclass)
 Index    | integer |           | not null | 
 RowData  | text    |           | not null | 
 LookupId | integer |           | not null | 

Insert method:

FormattableString str = $"INSERT into \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") VALUES {q.First()};";
await _context.Database.ExecuteSqlRawAsync(str.ToString(),cancellationToken);
await _context.Database.ExecuteSqlInterpolatedAsync($"INSERT into \"LookupRows\" (\"Index\", \"RowData\", \"LookupId\") VALUES {q.First()};",cancellationToken);

q.First() = (0, '{{"0":["0","Bucharest","0"]}}', 115)

ExecuteSqlRawAsync works fine; the entry is inserted successfully. But ExecuteSqlInterpolatedAsync always returns an error: MessageText: syntax error at or near "$1". I've run out of ideas. What am I doing wrong?

CodePudding user response:

Neither option works.

In the first case the code uses plain old string manipulation to construct a SQL query, opening the door to SQL injection and conversion errors. If q.First() were a string with the value ('','',''); drop table Users;--, you'd end up with a dropped table.

In the second case the syntax is simply invalid. The entire q.First() tuple is sent as a single positional parameter, so PostgreSQL receives INSERT into "LookupRows" (...) VALUES $1; instead of the three expected values, hence the syntax error at or near "$1".

The correct syntax for ExecuteSqlRawAsync is:

var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") 
VALUES (@index, @row, @lookup);";

var paramObject = new { index = 1, row = "abc", lookup = 100 };

await _context.Database.ExecuteSqlRawAsync(sql,
    new object[]
    {
        new NpgsqlParameter("index", paramObject.index),
        new NpgsqlParameter("row", paramObject.row),
        new NpgsqlParameter("lookup", paramObject.lookup)
    },
    cancellationToken);

Named parameters must be passed as DbParameter instances (NpgsqlParameter for PostgreSQL) whose names match the placeholders in the SQL; unlike Dapper, EF Core does not unpack an anonymous object into named parameters.

The correct syntax for ExecuteSqlInterpolatedAsync is:

await _context.Database.ExecuteSqlInterpolatedAsync(
    @$"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") 
VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});", 
cancellationToken);

or

FormattableString sql = @$"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") 
VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});";

await _context.Database.ExecuteSqlInterpolatedAsync(sql, cancellationToken);

ExecuteSqlInterpolatedAsync will inspect the FormattableString and generate a new parameterized query using the string placeholders as positional parameters and their values as parameter values. It's the equivalent of:

var sql = @"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") 
VALUES ({0},{1},{2});";

await _context.Database.ExecuteSqlRawAsync(sql, 1, "abc", 100);

Using ExecuteSqlInterpolatedAsync is rather risky because it's way too easy to forget to explicitly type the variable as FormattableString and end up with:

var sql = @$"INSERT INTO ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") 
VALUES ({paramObject.index},{paramObject.row},{paramObject.lookup});";

which is just a string constructed from data, and once again vulnerable to SQL injection.

Inserting 100K items

Looks like the actual problem is inserting 100K rows. ORMs are the wrong tool for this job. In this case, instead of a single graph of objects there are 100K rows with no business logic.

Executing 100K INSERT statements will take a long time and flood the database's transaction log. The solution is to insert the rows using bulk operations with minimal logging: SqlBulkCopy on SQL Server, or the COPY protocol (via Npgsql) on PostgreSQL.

SqlBulkCopy.WriteToServer expects a DataTable or DataReader. To use an IEnumerable<T> we can use FastMember's ObjectReader to wrap it:


IEnumerable<SomeType> data = ... 


using (var bcp = new SqlBulkCopy(connection))
using(var reader = ObjectReader.Create(data, "Id", "Name", "Description")) 
{ 
  bcp.DestinationTableName = "SomeTable"; 
  bcp.WriteToServer(reader); 
}
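SqlBulkCopy only talks to SQL Server, though, and the question's database is PostgreSQL. The equivalent there is the binary COPY protocol exposed by Npgsql's BeginBinaryImport. A minimal sketch for the LookupRows table above, assuming an open Npgsql connection string in connectionString and a hypothetical LookupRow POCO with Index, RowData and LookupId properties:

```csharp
using Npgsql;
using NpgsqlTypes;

// rows : IEnumerable<LookupRow> — the 100K items to insert (assumed to exist)
using var conn = new NpgsqlConnection(connectionString);
conn.Open();

// COPY streams all rows in one operation instead of 100K INSERTs
using var writer = conn.BeginBinaryImport(
    @"COPY ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") FROM STDIN (FORMAT BINARY)");

foreach (var row in rows)
{
    writer.StartRow();
    writer.Write(row.Index, NpgsqlDbType.Integer);
    writer.Write(row.RowData, NpgsqlDbType.Text);
    writer.Write(row.LookupId, NpgsqlDbType.Integer);
}

writer.Complete(); // commits the COPY; without it the import is rolled back
```

The Id column is omitted from the column list so the sequence default still fires.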

Importing CSVs

To import CSV files one can use CsvHelper's CsvDataReader:

using (var reader = new StreamReader("path\\to\\file.csv"))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
    // Do any configuration to `CsvReader` before creating CsvDataReader.
    using (var dr = new CsvDataReader(csv))
    using (var bcp = new SqlBulkCopy(connection))
    {
        bcp.DestinationTableName = "SomeTable";
        bcp.WriteToServer(dr);
    }
}
}
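Again, for PostgreSQL the CSV can be streamed straight into COPY through Npgsql's BeginTextImport, with the CSV parsing done server-side, no SqlBulkCopy or CsvHelper needed. A sketch, assuming the file's columns line up with the table's column list and connectionString is defined:

```csharp
using Npgsql;

using var conn = new NpgsqlConnection(connectionString);
conn.Open();

// FORMAT CSV lets the server parse the file; HEADER skips the first line
using (var importer = conn.BeginTextImport(
    @"COPY ""LookupRows"" (""Index"", ""RowData"", ""LookupId"") FROM STDIN (FORMAT CSV, HEADER)"))
using (var file = new StreamReader("path\\to\\file.csv"))
{
    importer.Write(file.ReadToEnd()); // disposing the importer completes the COPY
}
```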