Recently I wrote a script that reads data from a CSV file and runs some checks. As the CSV data is huge, I want to process it in batches: execute the first 50 lines and write the output to one folder, then execute the next 50 lines and write the output to another folder.
Below is the line I used to import the CSV file:
$P = Import-Csv -Path .\Processes.csv
and to export:
Export-Csv -Path "Data"
CodePudding user response:
-TotalCount is the best way to deal with these scenarios. Instead of importing the CSV, my recommendation would be to use Get-Content and pick only the lines required:
Get-Content .\Processes.csv -TotalCount 50 | Out-File .\Processes_first50.csv
Another recommendation would be to use the pipeline and then Select-Object -First:
Get-Content .\Processes.csv | select -First 50 | Out-File .\Processes_first50.csv
The last option is to use the -Head parameter:
Get-Content .\Processes.csv -Head 50 > .\Processes_first50.csv
The > operator redirects the output to a file; it is equivalent to Out-File but more concise.
Hope it helps.
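If you need all of the batches rather than just the first 50 lines, a plain loop over the imported rows works without any extra modules. A minimal sketch, assuming .\Processes.csv exists; the Batch1, Batch2, ... folder names are illustrative, not required:

```powershell
# Sketch: split an imported CSV into batches of 50 rows and
# write each batch into its own folder (Batch1, Batch2, ...).
$rows    = Import-Csv -Path .\Processes.csv
$size    = 50
$batchNr = 1
for ($i = 0; $i -lt $rows.Count; $i += $size) {
    # Slice the current batch; the upper bound is clamped for the last, shorter batch.
    $last  = [Math]::Min($i + $size - 1, $rows.Count - 1)
    $batch = $rows[$i..$last]
    $dir   = ".\Batch$batchNr"
    New-Item -ItemType Directory -Path $dir -Force | Out-Null
    $batch | Export-Csv -Path "$dir\Processes.csv" -NoTypeInformation
    $batchNr++
}
```

This also handles a final batch smaller than 50 (e.g. 200 rows yield four batches, 201 rows yield five).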
CodePudding user response:
To complement the helpful answer from Ranadip Dutta and answer the question in the comment: "how should I make sure my next count is the next 50 and so on, as there may be 200 records?"
You might use this Create-Batch function; see also: Slice a PowerShell array into groups of smaller arrays:
Install-Script -Name Create-Batch
Example:
$BatchNr = 1
Import-Csv -Path .\Processes.csv | Create-Batch 50 | ForEach-Object {
    $_ | ForEach-Object {
        $_ # do something with each item in the batch of 50
    } | Export-Csv ".\Batch$BatchNr.csv"
    $BatchNr++ # advance the counter so the next batch goes to a new file
}
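Since the question asks for each batch to land in its own folder rather than a flat file, the same Create-Batch pipeline can create a directory per batch. A sketch assuming the Create-Batch script is installed as above; the folder and file names are illustrative:

```powershell
# Sketch: write each batch of 50 rows into its own folder (Batch1, Batch2, ...).
$BatchNr = 1
Import-Csv -Path .\Processes.csv | Create-Batch 50 | ForEach-Object {
    $dir = ".\Batch$BatchNr"
    New-Item -ItemType Directory -Path $dir -Force | Out-Null
    $_ | Export-Csv -Path "$dir\Output.csv" -NoTypeInformation
    $BatchNr++
}
```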