I've got a console application that uses an API to get data, which is then saved into a CSV file in the following format:
Headers:
TicketID,TicketTitle,TicketStatus,CustomerName,TechnicianFullName,TicketResolvedDate
Body:
String values, where TicketResolvedDate is written as: YYYY-MM-DDTHH:mm:ssZ
Now I want to import this CSV file into my MSSQL Express database using the same console application, and make sure the TicketID is imported as an integer datatype and the TicketResolvedDate as a SQL datetime datatype.
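So a row in the body might look like this (made-up values):
1001,Printer not working,Closed,John Doe,Jane Smith,2023-01-15T14:30:00Z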
I've made the following code:
List<TicketCSV> tickets = new List<TicketCSV>();

using var reader1 = new StreamReader(OutputClosedTickets);
using var reader2 = new StreamReader(OutputWorkhours);
using var csv1 = new CsvReader((IParser)reader1);
{
    csv1.Configuration.Delimiter = ",";
    csv1.Configuration.MissingFieldFound = null;
    csv1.Configuration.PrepareHeaderForMatch = (string header, int index) => header.ToLower();
    csv1.ReadHeader();

    while (csv1.Read())
    {
        var record = new TicketCSV
        {
            TicketID = csv1.GetField<int>("TicketID"),
            TicketTitle = csv1.GetField("TicketTitle"),
            TicketStatus = csv1.GetField("TicketStatus"),
            CustomerName = csv1.GetField("CustomerName"),
            TechnicianFullName = csv1.GetField("TechnicianFullName"),
            TicketResolvedDate = SqlDateTime.Parse(csv1.GetField("TicketResolvedDate"))
        };
        tickets.Add(record);
    }
}
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "GeslotenTickets";
    bulkCopy.WriteToServer((IDataReader)csv1);

    bulkCopy.DestinationTableName = "WerkUren";
    bulkCopy.WriteToServer((IDataReader)reader2);
}
But I'm not sure if this is even remotely close to the approach I should be taking to accomplish this.
CodePudding user response:
You're on the right track, but there are a couple of issues with your code. You're reading the CSV data into objects, but then passing the CsvReader to the bulk copy operation. By that point, all the CSV data in the reader has already been consumed, because you read it all while creating the objects. Thus the SqlBulkCopy won't see any data in the reader.
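If you wanted to keep the CsvHelper approach, the fix would be to bulk copy the data you already materialized, for example by copying the tickets list into a DataTable first. A rough sketch, assuming the GeslotenTickets columns line up with your CSV headers:

// Requires: using System.Data; (for DataTable)
var table = new DataTable();
table.Columns.Add("TicketID", typeof(int));
table.Columns.Add("TicketTitle", typeof(string));
table.Columns.Add("TicketStatus", typeof(string));
table.Columns.Add("CustomerName", typeof(string));
table.Columns.Add("TechnicianFullName", typeof(string));
table.Columns.Add("TicketResolvedDate", typeof(DateTime));

foreach (var t in tickets)
{
    // SqlDateTime.Value unwraps the underlying DateTime.
    table.Rows.Add(t.TicketID, t.TicketTitle, t.TicketStatus,
        t.CustomerName, t.TechnicianFullName, t.TicketResolvedDate.Value);
}

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "GeslotenTickets";
    bulkCopy.WriteToServer(table);
}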
The next issue you're likely to run into is that the "schema" of the data reader needs to match the schema of the target SQL table. If the schemas don't match, you'll typically get a cryptic error message out of the SqlBulkCopy operation, saying that some type can't be converted.
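Building on the DataTable sketch above, one way to avoid ordinal mismatches is to add explicit column mappings, so SqlBulkCopy matches columns by name instead of by position:

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "GeslotenTickets";
    // Match columns by name instead of relying on their order.
    foreach (var col in new[] { "TicketID", "TicketTitle", "TicketStatus",
                                "CustomerName", "TechnicianFullName", "TicketResolvedDate" })
    {
        bulkCopy.ColumnMappings.Add(col, col);
    }
    bulkCopy.WriteToServer(table);
}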
I maintain a library that I've specifically designed to work well in this scenario: Sylvan.Data.Csv. It allows you to apply a schema to the "untyped" CSV data.
Here is an example of how you could write CSV data to a table in SQL Server:
using Sylvan.Data.Csv;
using System.Data.SqlClient;

static void LoadTableCsv(SqlConnection conn, string tableName, string csvFile)
{
    // Read the column schema of the target table.
    var cmd = conn.CreateCommand();
    cmd.CommandText = $"select top 0 * from {tableName}"; // beware of SQL injection
    var reader = cmd.ExecuteReader();
    var colSchema = reader.GetColumnSchema();
    reader.Close();

    // Apply the column schema to the CSV reader.
    var csvSchema = new CsvSchema(colSchema);
    var csvOpts = new CsvDataReaderOptions { Schema = csvSchema };
    using var csv = CsvDataReader.Create(csvFile, csvOpts);

    // Stream the typed CSV rows directly into the target table.
    using var bulkCopy = new SqlBulkCopy(conn);
    bulkCopy.DestinationTableName = tableName;
    bulkCopy.EnableStreaming = true;
    bulkCopy.WriteToServer(csv);
}
You still might encounter errors if the CSV data doesn't correctly match the schema, or has invalid or broken records, but this should work as long as your CSV files are clean and valid.