A more efficient way of transferring bulk amount of rows from sheet to sheet - Google Apps Script


I am trying to transfer rows of data from Sheet 1 to Sheet 2.

Every day I mark a few rows in Sheet 1 that I want to transfer to Sheet 2. At the end of the week I trigger this script to transfer them:

function archive() {

    var lock = LockService.getScriptLock();
    lock.waitLock(10000);

    var ss = SpreadsheetApp.getActiveSpreadsheet();
    
    /* Sheet 1 */

    var firstSheet = ss.getSheetByName("Sheet 1");
    var firstSheetValues = firstSheet.getRange(1, 2, firstSheet.getLastRow(), 1).getValues();

    var moveRows = firstSheetValues.reduce(function(ar, e, i) {
        if (e[0] == "Archive") ar.push(i + 1);
        return ar;
        }, []);
    

    /* Sheet 2 */

    var secondSheet = ss.getSheetByName("Sheet 2");
    
    moveRows.forEach(function(e) {
        firstSheet.getRange(e, 1, 1, firstSheet.getLastColumn()).moveTo(secondSheet.getRange(secondSheet.getLastRow() + 1, 1));
    });
    
    moveRows.reverse().forEach(function(e) {
        firstSheet.deleteRow(e);
    });

  lock.releaseLock();
  SpreadsheetApp.flush();

}

If "Archive" is written in Column B, the script moves that row (the script reads column 2 with `getRange(1, 2, ...)`).

The issue is that with 10-20 rows it moves them one by one, which is very slow and sometimes freezes.

Is there a more efficient way to transfer rows on trigger? I want them all to go at once. The rows being archived are randomly assigned; they could be rows 2, 3, 4, 7, 10, 15, 22, 23, 26, etc.

Then in Sheet 2 they should be appended after the last row without overwriting anything.

It would be great if this could be done in one go, since the current script moves the rows and then deletes them one by one.

I also want to mention that this spreadsheet is shared by a few people: each person has their own sheet, and all of them transfer rows to the same shared sheet, so please keep that in mind.

Thank you

CodePudding user response:

I believe your goal is as follows.

  • Your script works fine, but you want to reduce its processing cost.

In this case, how about the following modification?

Modified script:

In this modification, the matching rows are copied in a single setValues call and deleted in a single batch request with the Sheets API, which should reduce the processing cost. Please enable the Sheets API under Advanced Google services before you use this script.
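As background, each archived row becomes one deleteDimension request in the batch. The Sheets API uses 0-based, end-exclusive row indices, so 1-based sheet row 5 maps to startIndex 4, endIndex 5. A minimal sketch of one such request (the sheetId value here is hypothetical; in practice it comes from getSheetId()):

```javascript
// One deleteDimension request for 1-based sheet row 5.
// Sheets API row indices are 0-based and end-exclusive.
var sheetId = 0; // hypothetical; use firstSheet.getSheetId() in practice
var row = 5;     // 1-based row number as shown in the sheet UI
var request = {
  deleteDimension: {
    range: { sheetId: sheetId, dimension: "ROWS", startIndex: row - 1, endIndex: row }
  }
};
console.log(JSON.stringify(request.deleteDimension.range));
// → {"sheetId":0,"dimension":"ROWS","startIndex":4,"endIndex":5}
```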

function archive() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var firstSheet = ss.getSheetByName("Sheet 1");
  var firstSheetValues = firstSheet.getRange(1, 1, firstSheet.getLastRow(), firstSheet.getLastColumn()).getValues();
  var sheetId = firstSheet.getSheetId();
  var {values, requests} = firstSheetValues.reduce(function (o, r, i) {
    if (r[1] == "Archive") {
      o.values.push(r);
      o.requests.push({ deleteDimension: { range: { sheetId, startIndex: i, endIndex: i + 1, dimension: "ROWS" } } });
    }
    return o;
  }, { values: [], requests: [] });
  var secondSheet = ss.getSheetByName("Sheet 2");
  if (values.length == 0) return; // Nothing marked "Archive" this week.
  secondSheet.getRange(secondSheet.getLastRow() + 1, 1, values.length, values[0].length).setValues(values);
  Sheets.Spreadsheets.batchUpdate({ requests: requests.reverse() }, ss.getId());
}
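A further reduction (a sketch, not part of the answer above) is to merge runs of consecutive rows into single deleteDimension requests before calling batchUpdate, so deleting rows 2, 3, 4 and 7 costs two requests instead of four. The helper name toDeleteRequests is hypothetical:

```javascript
// Collapse runs of consecutive 0-based row indices into single
// deleteDimension requests. "toDeleteRequests" is a hypothetical helper.
function toDeleteRequests(rowIndices, sheetId) {
  var sorted = rowIndices.slice().sort(function (a, b) { return a - b; });
  var requests = [];
  sorted.forEach(function (i) {
    var last = requests[requests.length - 1];
    if (last && last.deleteDimension.range.endIndex === i) {
      last.deleteDimension.range.endIndex = i + 1; // extend the current run
    } else {
      requests.push({
        deleteDimension: {
          range: { sheetId: sheetId, dimension: "ROWS", startIndex: i, endIndex: i + 1 }
        }
      });
    }
  });
  // Reverse so higher row ranges are deleted first and the lower
  // indices stay valid while the batch is applied in order.
  return requests.reverse();
}

console.log(toDeleteRequests([2, 3, 4, 7], 0).length); // → 2
```

In archive, the collected row indices could be passed through a helper like this instead of pushing one request per row.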
