I'm a novice and I'm pretty sure I haven't set up the for-loop correctly in my function "importData":
function urlsToSheets() {
  importData("https://hub.arcgis.com/datasets/d3cd48afaacd4913b923fd98c6591276_36.csv", "Pavement Condition");
  importData("https://hub.arcgis.com/datasets/lahub::tctmc-streets-of-significance-construction-impacted-streets.csv", "Streets of Significance");
  importData("https://geohub.lacity.org/datasets/lahub::one-year-moratorium-streets.csv", "One-Year Moratorium");
  importData("https://hub.arcgis.com/datasets/lahub::boe-permits-lines.csv", "BOE Permit Lines");
  importData("https://hub.arcgis.com/datasets/lahub::archived-boe-permits-lines.csv", "Archived BOE Permit Lines");
  importData("https://hub.arcgis.com/datasets/lahub::boe-permits-points.csv", "BOE Permit Points");
  importData("https://hub.arcgis.com/datasets/56318ef6ed6444d981977adf80157b87_5.csv", "Archived BOE Permit Points");
}
function importData(url, sheetName) {
  const file = UrlFetchApp.fetch(url);
  const csv = file.getBlob().getDataAsString();
  const csvData = csvToArray(csv);
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName(sheetName);
  sheet.clear();
  for (let i = 0; i <= csvData.length; i++) {
    sheet.getRange(1, 1, csvData.length, csvData[i].length).setValues(csvData);
  }
}
Each of these URLs has thousands of rows of data; the first one alone has 85k rows. It takes over 10 minutes for the csvData to transfer to its intended sheet.
What am I doing wrong? How can I make this quicker?
Thank you!
CodePudding user response:
Instead of setting the data row by row, set the values of the whole two-dimensional matrix in one call. The fewer interactions you have with the SpreadsheetApp API, the faster your code runs.
I suggest building a single array containing all of your data, then selecting the full range and setting its values in one statement outside the for-loop.
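To make the difference concrete, here is a minimal sketch using a hypothetical mock sheet that just counts setValues calls (in real Apps Script, the sheet comes from SpreadsheetApp and cannot be run outside that environment):

```javascript
// Mock stand-in for an Apps Script sheet: counts how many times
// setValues is invoked, since each call is one round trip to the API.
const mockSheet = {
  calls: 0,
  data: [],
  getRange(row, col, numRows, numCols) {
    const self = this;
    return {
      setValues(values) {
        self.calls += 1; // one API round trip per setValues call
        self.data = values;
      },
    };
  },
};

const csvData = [["id", "street"], ["1", "Main St"], ["2", "Broadway"]];

// Slow pattern: one setValues call per row -> one round trip per row.
csvData.forEach((row, i) => {
  mockSheet.getRange(i + 1, 1, 1, row.length).setValues([row]);
});
console.log(mockSheet.calls); // 3 round trips for 3 rows

// Fast pattern: one setValues call for the whole matrix -> 1 round trip.
mockSheet.calls = 0;
mockSheet.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData);
console.log(mockSheet.calls); // 1 round trip
```

With 85k rows, the per-row pattern means 85,000 round trips versus a single one, which is where the 10+ minutes go.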
CodePudding user response:
Thank you everyone! The simplest solution was to take sheet.getRange(1, 1, csvData.length, csvData[0].length).setValues(csvData) out of the for-loop and to make it its own line.
Thank you!
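As a side note, the question calls a csvToArray helper that isn't shown. A naive stand-in might look like the sketch below; it assumes no quoted fields containing commas or newlines, so in real Apps Script the built-in Utilities.parseCsv(csv) is the more robust choice:

```javascript
// Hypothetical csvToArray: splits a CSV string into a 2-D array.
// Naive sketch only -- it does NOT handle quoted fields that contain
// commas or newlines; Utilities.parseCsv(csv) handles those in Apps Script.
function csvToArray(csv) {
  return csv
    .trim()                          // drop a trailing newline so the last row isn't [""]
    .split(/\r?\n/)                  // one array entry per line
    .map(line => line.split(","));   // one entry per comma-separated field
}

console.log(csvToArray("id,street\n1,Main St\n2,Broadway"));
// [["id","street"],["1","Main St"],["2","Broadway"]]
```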