I am using .csv files of bank statement records that I want to import into MySQL using Laravel, and I want to validate the records to check for duplicates. I have created a "checkrow" field below, which is a combination of fields from the .csv file.
while (($record = fgetcsv($input_file, 1000, ",")) !== FALSE)
{
    // Skip the CSV header row on the first iteration
    if (!$heading)
    {
        $product = array(
            // Combination of the CSV fields, used to detect duplicates
            "checkrow"      => "ABSA" . $record[0] . $record[1] . $record[2] . $record[3],
            "bankname"      => "ABSA",
            "accountnumber" => "123456",
            "trnDate"       => $record[0],
            "description"   => $record[1],
            "reference"     => self::getAbsaReference($record[1]),
            "amount"        => $record[2],
        );

        print_r($product);
        BankStatement::create($product);
    }
    $heading = false;
}
When I run BankStatement::create($product); to insert into the table, the "checkrow" column in MySQL (populated from the 'checkrow' field above) has a unique index, so it does not allow duplicates. When the code hits a duplicate it throws an "Integrity constraint violation", which is exactly what it is supposed to do.
However, I would like it to skip that row of the .csv file and move on to the next one when it identifies a duplicate.
CodePudding user response:
You can use updateOrCreate():
BankStatement::updateOrCreate([
    'checkrow' => "ABSA" . $record[0] . $record[1] . $record[2] . $record[3],
], [
    'bankname'      => "ABSA",
    'accountnumber' => "123456",
    'trnDate'       => $record[0],
    'description'   => $record[1],
    'reference'     => self::getAbsaReference($record[1]),
    'amount'        => $record[2],
]);
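Note that updateOrCreate() will update the remaining columns of an existing row when a matching checkrow is found. If the goal is simply to skip duplicates and leave existing rows untouched, firstOrCreate() takes the same two arrays and only inserts when no match exists; a minimal sketch using the same fields as above:
BankStatement::firstOrCreate([
    // Attributes used to look up an existing row
    'checkrow' => "ABSA" . $record[0] . $record[1] . $record[2] . $record[3],
], [
    // Values used only if a new row has to be created
    'bankname'      => "ABSA",
    'accountnumber' => "123456",
    'trnDate'       => $record[0],
    'description'   => $record[1],
    'reference'     => self::getAbsaReference($record[1]),
    'amount'        => $record[2],
]);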
An alternative is upsert(), which was introduced in Laravel 8.
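A minimal sketch of the upsert() approach, reusing the column names from the question; it relies on the unique index on checkrow (which the question's schema already has) to detect conflicts:
BankStatement::upsert(
    [[
        'checkrow'      => "ABSA" . $record[0] . $record[1] . $record[2] . $record[3],
        'bankname'      => "ABSA",
        'accountnumber' => "123456",
        'trnDate'       => $record[0],
        'description'   => $record[1],
        'reference'     => self::getAbsaReference($record[1]),
        'amount'        => $record[2],
    ]],
    ['checkrow'],                 // column(s) that uniquely identify each record
    ['description', 'amount']     // columns to update when a duplicate is found
);
Because upsert() accepts an array of rows, you can also collect every parsed CSV line first and write them all in a single query instead of one insert per loop iteration.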