How to batch array_map?

Time:12-09

I have this code that combines 3 arrays using array_map() and stores them in a CSV file in the format I like:

    $list = array_map(null, $array1, $array2, $array3);
    $filename = 'file.csv';
    $fp = fopen($filename, 'w');
    foreach ($list as $fields) {
        fputcsv($fp, $fields);
    }
    fclose($fp);

The problem is that the arrays contain millions of rows, so whenever I run my script it throws a fatal error: allowed memory size exhausted.

Is there a way to batch the array_map() results instead of storing them all in $list at once?

I have been trying to find a solution online for a few days now, and none of the solutions I found are compatible with my code. I’m basically at my last straw here and I’m open to trying any idea you guys could come up with!

CodePudding user response:

  1. Give the PHP process more memory (raise its memory_limit setting).
  2. It's possible to process the arrays in chunks to reduce the amount of memory used at any given time. Here is one way you could do this:
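For option 1, the limit can be raised from inside the script itself; a minimal sketch (the 512M value is an arbitrary example, pick one your server can actually afford, or -1 to remove the limit entirely, which is risky on shared hosts):

```php
<?php
// Raise the memory limit for this script only; the server-wide default
// in php.ini is left untouched. 512M is just an example value.
ini_set('memory_limit', '512M');

// Verify the new limit took effect.
echo ini_get('memory_limit'), PHP_EOL;
```

Note that for millions of rows this only postpones the problem; option 2 keeps memory usage flat regardless of input size.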

Create a function that processes a chunk of the arrays and writes the resulting rows to the CSV file. Besides the three arrays, the function takes two more arguments: the start index and the number of items to process.

    function processChunk($array1, $array2, $array3, $start, $numItems) {
      $filename = 'file.csv';
      // Open in append mode so each chunk adds rows to the file instead
      // of overwriting what the previous chunks wrote.
      $fp = fopen($filename, 'a');
    
      // Process the chunk of arrays, starting at the specified index
      // and processing up to $numItems items.
      for ($i = $start; $i < $start + $numItems && $i < count($array1); $i++) {
        $fields = [$array1[$i], $array2[$i], $array3[$i]];
        fputcsv($fp, $fields);
      }
    
      fclose($fp);
    }

Use a loop to process the arrays in chunks. For example, you could process 1000 items at a time. Inside the loop, call the processChunk() function to process each chunk of the arrays.

    $chunkSize = 1000;
    for ($i = 0; $i < count($array1); $i += $chunkSize) {
      processChunk($array1, $array2, $array3, $i, $chunkSize);
    }

This lets you process the arrays in smaller chunks, which should reduce peak memory usage. You could also schedule the chunks from a cron job.

CodePudding user response:

Assuming they all have the same number of elements, loop over the first array and pass the corresponding values from the other two straight into the write, without having to build a new array with all of the values...

    $filename = 'file.csv';
    $fp = fopen($filename, 'w');
    foreach ($array1 as $key => $value1) {
        fputcsv($fp, [$value1, $array2[$key], $array3[$key]]);
    }
    fclose($fp);
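If the arrays are not guaranteed to have the same length, the same streaming approach still works with a small guard; a sketch with toy values (writing to php://temp here purely for demonstration) that pads missing values with null, mirroring what array_map(null, ...) does for shorter arrays:

```php
<?php
// Toy stand-ins for the question's arrays, deliberately unequal in length.
$array1 = ['a', 'b', 'c'];
$array2 = [1, 2];
$array3 = ['x'];

// php://temp stands in for file.csv so the demo is self-contained.
$fp = fopen('php://temp', 'w+');

// Iterate up to the longest array; ?? null fills the gaps, and
// fputcsv writes null as an empty field.
$rows = max(count($array1), count($array2), count($array3));
for ($i = 0; $i < $rows; $i++) {
    fputcsv($fp, [$array1[$i] ?? null, $array2[$i] ?? null, $array3[$i] ?? null]);
}

rewind($fp);
echo stream_get_contents($fp);
fclose($fp);
```

Either way, only one row is ever held in memory at a time, which is what makes this answer scale to millions of rows.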
Tags: php