
We have a Google App Maker app backed by Google Cloud SQL tables. We are located in Prague, Central Europe, and app.saveRecords() takes an incredible 240ms per record at minimum. We have tried many things, including changing the location of the instance, but nothing solved the problem.

Fortunately, inserting new records in batches solves this. That was brilliantly covered by the answer to my previous thread [ Google App Maker saving records to DB is taking 240ms per record ].

We would be so grateful if anyone could provide simple code to update records in the DB in batches [server-side scripting]. That way we could not only insert completely new records in batches, but also quickly update the data of records already saved in the DB.

Let's say we have a DB with 3 custom fields:

product code | product availability | price

+ an example array with the data:

ourData[0] = ['pc001','in stock','99'];
ourData[1] = ['pc002','out of stock','49'];
ourData[2] = ['pc003','out of stock','199'];
ourData[3] = ['pc004','in stock','149'];
ourData[4] = ['pc005','in stock','299'];

...and we want to assign the availability and price data to the matching DB row using "product code" as the key. If the product code is not found in the DB, we insert a new record.
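To make the intended behavior concrete, here is a tiny illustration in plain JavaScript (the `classifyRows` helper is hypothetical, just for this example) of which rows of `ourData` should update an existing DB row and which should create a new one:

```javascript
// Illustration only: given the product codes already in the DB, split the
// incoming rows into those that would update an existing record and those
// that would insert a new one.
function classifyRows(existingCodes, rows) {
  var known = {}; // set of product codes already in the DB
  existingCodes.forEach(function (c) { known[c] = true; });
  var result = { updates: [], inserts: [] };
  rows.forEach(function (row) {
    // row[0] is the product code
    (known[row[0]] ? result.updates : result.inserts).push(row[0]);
  });
  return result;
}
```

For example, if the DB already contains pc001 and pc003, the five rows above split into updates ['pc001','pc003'] and inserts ['pc002','pc004','pc005'].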

Multiple single-line inserts/updates take too long for Google App Maker users in the EU, because each query goes through the US even when the SQL instance is located in the EU. Running the writes in batches could work around this limitation for users outside the US. Thank you so much for any help to make Google App Maker a great tool for beginners all around the world.

  • 1. Where is `ourData` coming from? 2. Why do you need this to be fast? – TheMaster Apr 02 '19 at 12:13
  • Blocking the UI on an upload / long-term task is not a great design pattern. – tehhowch Apr 02 '19 at 12:15
  • As far as the connection mechanics, take that up with your GSuite support contact. Being referred to volunteers for what looks like an engineering issue in their implementation is not something you should allow. – tehhowch Apr 02 '19 at 12:17
  • Thank you for your effort. 1. ourData is just an example array; I tried to make the array data as clear as possible. 2. We have an array with 20,000 lines which needs to be imported into the DB, and it would take over an hour without batches. – Daniel Kacko Apr 02 '19 at 12:17
  • @Daniel 2. So what if it takes an hour? Let it import at midnight or during non-working hours... What exactly is the problem? 1. You still haven't said where it comes from. – TheMaster Apr 02 '19 at 12:20
  • 3. `app.saveRecords` does just that (update/insert depending on `_key`). The previous solution should be enough, if your primary key is configured properly. – TheMaster Apr 02 '19 at 12:24
  • @tehhowch Thanks! I am filing it. However, the solution is to be found in the simple code anyways. I am afraid, it will take them months to change such an engineering issue. – Daniel Kacko Apr 02 '19 at 12:27
  • @TheMaster 2. We need to import many suppliers' XML data into the DB several times a day (6–16 times per supplier), so finishing the save within a few minutes really is needed. 1. The arrays come from their XMLs, Google Spreadsheets etc. 3. Thanks! I am looking at it. – Daniel Kacko Apr 02 '19 at 12:30
  • Can you show some code? So you're polling Google spreadsheet from appmaker every few minutes for data? And if present, updating that in the database? How did you determine 240ms for 1 record save? – TheMaster Apr 02 '19 at 12:55
  • @TheMaster The code is shown here: https://stackoverflow.com/questions/55419294/google-app-maker-saving-records-to-db-is-taking-240ms-per-record Parsing spreadsheets and XMLs in Google App Maker is super fast. We measured 240ms per saved record using the simple code mentioned in the previous thread. We tried preview/deployment, and changing the instance's location and configuration. It's all about the ping. – Daniel Kacko Apr 02 '19 at 13:11
  • Another option is jdbc. Create a batch of statements and execute them as a batch. – TheMaster Apr 02 '19 at 13:18
  • @TheMaster thanks! I am going to try app.SaveRecords depending on _key first and see. – Daniel Kacko Apr 02 '19 at 13:25
  • It'll automatically depend on key. You just need to set product code field as the primary key – TheMaster Apr 02 '19 at 13:26
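The JDBC route mentioned in the comments can be sketched roughly as follows. This is an untested sketch: the connection string, table name, and column names are placeholders, and it assumes productCode has a UNIQUE index so that MySQL's INSERT ... ON DUPLICATE KEY UPDATE performs the upsert in a single batched round trip:

```javascript
// Build a MySQL upsert statement with ? placeholders. keyCol must have a
// UNIQUE index so ON DUPLICATE KEY UPDATE turns conflicting inserts into updates.
function buildUpsertSql(table, keyCol, otherCols) {
  var cols = [keyCol].concat(otherCols);
  var placeholders = cols.map(function () { return '?'; }).join(', ');
  var updates = otherCols
    .map(function (c) { return c + ' = VALUES(' + c + ')'; })
    .join(', ');
  return 'INSERT INTO ' + table + ' (' + cols.join(', ') + ') VALUES (' +
         placeholders + ') ON DUPLICATE KEY UPDATE ' + updates;
}

// Sketch of running the whole import as one JDBC batch in an App Maker
// server script (the connection string below is a placeholder).
function upsertViaJdbc(data) {
  var conn = Jdbc.getCloudSqlConnection('jdbc:google:mysql://my-project:my-instance/mydb');
  var stmt = conn.prepareStatement(
      buildUpsertSql('testDB', 'productCode', ['availability', 'price']));
  for (var i = 0; i < data.length; i++) {
    stmt.setString(1, data[i][0]); // productCode
    stmt.setString(2, data[i][1]); // availability
    stmt.setString(3, data[i][2]); // price
    stmt.addBatch();
  }
  stmt.executeBatch(); // one round trip for the whole batch
  conn.close();
}
```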

2 Answers


With the help of others here in the thread, I managed to reduce the time to insert or update a record to 50ms. I have also tested inserting/updating 10,000 records into a table containing 60 fields; in that case the speed was 60ms per record. Below is my final code, for anyone who may find it useful. Thank you for all the helpful answers and comments!

/* We use Google Cloud SQL Table "testDB" that contains these fields:
   - Id (the default primary key for DBs in Google App Maker)
   - productCode
   - availability
   - price

   We have an array called "data" which contains the data [productCode, availability, price] we want to save to the database.
*/

function saveDataNew(){
    var data = [];
    data[0] = ['pc1','in stock','99'];
    data[1] = ['pc2','out of stock','129'];
    data[2] = ['pc3','in stock','78'];
    data[3] = ['pc4','in stock','95'];

    //load all the records in the DB (no filter is used)
    var query = app.models.testDB.newQuery();
    var records = query.run();
    console.log("Found " + records.length + " records in the database.");

    //create an array to help us find DB's Id for each unique productCode.
    var helpingArray = [];
    for (var x in records) {
      var product = records[x]; //get each record in the DB
      helpingArray.push([product.productCode,product.Id]); //save this record's productCode and Id into the helping array
    }

    //setting up the writing in batches
    var totalRecords = data.length;
    var batchLimit = 500; //size of a batch round - number of lines for each app.saveRecords();
    var roundlimit = Math.min(totalRecords, batchLimit); //lines to write in the first round
    var totalRounds = Math.ceil(totalRecords / batchLimit);
    var round = 1;
    var currentItem = 0;

    //start writing in batches
    do{ 
        var recordsToSave = [];
        //create or update a record in the DB for each line of our data within one batch round
        for(var i=currentItem; i<roundlimit; i++){
          var wantedCode = data[i][0]; //get the productCode of the current line of the data array
          var orderNum = -1; //create a variable to find the order number of the productCode in the helping array
          for(var z=0; z<helpingArray.length; z++){ //search for the productCode in the helping array
            if(helpingArray[z][0]==wantedCode){
              orderNum = z; //save the line's number if the productCode is found in the helpingArray
              break; //stop searching once a match is found
            }
          }
          var productRecord;
          if (orderNum == -1){ //no line with the productCode was found in the helpingArray => create a new record
                productRecord = app.models.testDB.newRecord();
                productRecord.productCode = data[i][0]; 
          }
          else{ //the productCode was found in the helpingArray => edit the existing record in the DB 
                productRecord = records[orderNum];
          }
          //provide the record with the data
          productRecord.availability = data[i][1];
          productRecord.price = data[i][2];

          //cumulate records and save them once the batch round is finished
          recordsToSave.push(productRecord); 
        }

        //a batch round has been finished, save records if there are any
        if(recordsToSave.length){
            console.log("Records saved: "+recordsToSave.length);
            app.saveRecords(recordsToSave);
        }

        currentItem += batchLimit;
        round++; 

        //extend the limit for the next round, capped at the total number of records
        roundlimit = Math.min(totalRecords, roundlimit + batchLimit);
    } while(round <= totalRounds);
}
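One further tweak: the inner loop over helpingArray makes the matching step O(n·m). A plain object used as a hash map keyed by productCode makes each lookup constant-time. A minimal sketch, with plain objects standing in for the records returned by query.run() (in App Maker you would use the real record objects and `app.models.testDB.newRecord()`):

```javascript
// Merge incoming rows [productCode, availability, price] into existing
// records, using an object as a hash map instead of a linear search.
function mergeByProductCode(existing, data) {
  var byCode = {}; // productCode -> existing record
  for (var i = 0; i < existing.length; i++) {
    byCode[existing[i].productCode] = existing[i];
  }
  var toSave = [];
  for (var j = 0; j < data.length; j++) {
    var row = data[j];
    var record = byCode[row[0]];
    if (!record) { // not in the DB yet => new record
      record = { productCode: row[0] }; // in App Maker: app.models.testDB.newRecord()
    }
    record.availability = row[1];
    record.price = row[2];
    toSave.push(record); // in App Maker: pass to app.saveRecords in batches
  }
  return toSave;
}
```

The returned array would then be saved in batches with app.saveRecords(), exactly as in the loop above.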

This issue can be solved with proper server scripting, and I strongly encourage you to read how server scripting works in the official documentation. Pay close attention to the Querying records example.

So, taking the previous solution as a base, it can be slightly modified to achieve what you need. This is how it could look:

function saveData(){

    //get the data
    var data = getData();

    var totalRecords = data.length;
    var batchLimit = 2000;
    var totalRounds = Math.ceil(totalRecords / batchLimit);
    var round = 1;
    var roundlimit = batchLimit;
    var currentItem = 0;

    do{
        var recordsToSave = [];

        for(var i=currentItem; i<roundlimit && i<totalRecords; i++){ //stop at totalRecords so the last round does not run past the data

            var recordData = data[i];
            var productCode = recordData[0];

            //check if the record already exists
            var query = app.models.testDB.newQuery();
            query.filters.productCode._equals = productCode;
            var productRecord = query.run()[0];

            //if non existent, create a new one
            if(!productRecord){
                productRecord = app.models.testDB.newRecord();
                productRecord.productCode = productCode;
            }

            productRecord.availability = recordData[1];
            productRecord.price = recordData[2];
            recordsToSave.push(productRecord);
        }

        if(recordsToSave.length){
            app.saveRecords(recordsToSave);
        }

        currentItem += batchLimit;
        roundlimit += batchLimit;
        round++;

    } while(round <= totalRounds);

}
E_net4
Morfinismo
  • Thank you very much again! It reduced the time by 50% (126 ms per record in the DB). Unfortunately, the line "var productRecord=query.run()[0];" takes 122 ms. As for your recommendation, I am now fully aware of the guidelines. I will really think twice before posting anything again. I hope this thread and your answer is going to help someone in the future though. My guess now is moving the query.run() out of the loop, creating an array of the DB (iD,productCode), and use this array to assign "iD" to the records in the loop (primary key in App Maker) while saving (app.saveRecords). – Daniel Kacko Apr 02 '19 at 16:56
  • @DanielKacko please proceed with that. I think you should get better results. Good luck! – Morfinismo Apr 02 '19 at 17:02
  • Thank you for your knowledgeable go-ahead! If it succeeds, I will post it as an answer to this question. Nevertheless, your code helped me a lot after two weeks of struggling with this issue. – Daniel Kacko Apr 02 '19 at 17:31