I want to export my data as a CSV file, and I'm using the CsvHelper library for that. But I don't want all the data in one CSV file; each file should be limited to 1000 rows.
What I tried for that limitation:
var fileStream = new FileStream("/static/export.csv", FileMode.Create, FileAccess.Write);
var memoryStream = new MemoryStream();
var streamWriter = new StreamWriter(memoryStream);
var writer = new CsvWriter(streamWriter, config);
for (var i = 0; i < indexCount; i += 1000)
{
    var items = result.Skip(i).Take(1000);

    // logic for writing records with CsvHelper

    writer.Flush();
    memoryStream.Position = 0;
    byte[] data = memoryStream.ToArray();
    fileStream.Write(data, 0, data.Length);
}
For example, if I have 100000 rows in the database, I want to end up with 100 CSV files. How can I download the chunked files from the stream?
CodePudding user response:
var i = 1;
foreach (var chunk in result.Chunk(1000))
{
    using var fileStream = new FileStream($"/static/export{i++}.csv", FileMode.Create, FileAccess.Write);
    using var streamWriter = new StreamWriter(fileStream);
    using var writer = new CsvWriter(streamWriter, config);
    // logic for writing records with CsvHelper
}
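Putting it together, a complete sketch might look like this. It assumes .NET 6+ (where `Enumerable.Chunk` is available) and a hypothetical `Item` record type standing in for your own model; `directory` and `ExportInChunks` are illustrative names, not part of the original question:

using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using CsvHelper;
using CsvHelper.Configuration;

// Hypothetical record type; replace with your own model.
public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CsvExporter
{
    public static void ExportInChunks(IEnumerable<Item> result, string directory, int chunkSize = 1000)
    {
        var config = new CsvConfiguration(CultureInfo.InvariantCulture);
        var i = 1;

        // Chunk(1000) yields arrays of up to 1000 items; each one gets its own file.
        foreach (var chunk in result.Chunk(chunkSize))
        {
            var path = Path.Combine(directory, $"export{i++}.csv");
            using var streamWriter = new StreamWriter(path);
            using var writer = new CsvWriter(streamWriter, config);
            writer.WriteRecords(chunk); // writes the header plus this chunk's rows
        }
    }
}

Disposing the CsvWriter (via `using`) flushes and closes the underlying streams, so the manual Flush/MemoryStream copying from the original attempt isn't needed; each chunk is written straight to its own file.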