9

I am using JSONP to collect data from the user but do not require the user to get a response.

Therefore I want to send the user an instant response so they can continue without having to wait for the server processing.

How do I send them a response but continue processing?

I'm using Google Script but I guess any javascript way to return a response and continue processing should work too.

I have something like:

function handleResponse(e) {
    //do something with e
    return ContentService
      .createTextOutput('console.log("updated")')
      .setMimeType(ContentService.MimeType.JAVASCRIPT);
}

I would like to return the response and then "do something with e".

Edit: OK, after a lot of mucking around I have a semi-working solution (there are always roadblocks!).

Currently I have:

var SCRIPT_PROP = PropertiesService.getScriptProperties();
function doGet(e){
    // Properties only store strings, so serialize the parameters object
    SCRIPT_PROP.setProperty("myParameters", JSON.stringify(e.parameters));
    ScriptApp.newTrigger("handleResponse")
      .timeBased()
      .after(20 * 1000)
      .create();
    return ContentService
      .createTextOutput('console.log("processing")')
      .setMimeType(ContentService.MimeType.JAVASCRIPT);
}
function handleResponse() {
    Logger.log(SCRIPT_PROP.getProperty("myParameters"));
}

What it's doing is saving the data from the user to a global-like variable. Then it sets up a trigger to execute the handleResponse() function after 20 seconds. Finally, it returns something to the user so they do not have to wait for the handleResponse() function to finish.
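Since script properties are a shared key/value store (and only hold strings), concurrent requests can overwrite a single "myParameters" entry, as noted in the comments. A minimal sketch of per-request keys in plain JavaScript, with a Map standing in for PropertiesService and a hypothetical `req_<timestamp>` key scheme:

```javascript
// Sketch: store each request under a unique key so concurrent calls
// don't overwrite each other. A plain Map stands in for
// PropertiesService (which only stores strings, hence JSON.stringify).
const store = new Map();

function saveRequest(params) {
  // Unique-ish key: timestamp plus a random suffix (hypothetical scheme)
  const key = 'req_' + Date.now() + '_' + Math.floor(Math.random() * 1e6);
  store.set(key, JSON.stringify(params));
  return key;
}

function loadRequest(key) {
  return JSON.parse(store.get(key));
}

const key = saveRequest({ name: ['Craig'], score: ['42'] });
console.log(loadRequest(key).name[0]); // → Craig
```

The triggered function can then process and delete one key at a time without racing other requests.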

Now for the problems I am having with this solution: it seems to be hit and miss. It will sometimes fire the handleResponse() function and sometimes never run it at all.

The docs say that triggers execute at the time you specify, +/- 15 minutes! When it works, I have seen it take anywhere from 10 to 45 seconds. When it doesn't, I have waited 20 minutes and still nothing. It seems the shorter I set the trigger, the more often it never executes.

The other problem is that I can only have 14 triggers at once, so if they decide to take 15 minutes to execute I can easily hit that limit.

Is there any other way to get a solution like this to work?

Craig
  • 2,093
  • 3
  • 28
  • 43
    how is `doGet` called? also, seems to me, if `doGet` is called multiple times, before any previous `handleResponse` triggers finish, you could get the wrong parameters from your global variable. or maybe i'm missing something... – WhiteHat Sep 12 '15 at 21:12
  • @WhiteHat doGet is called whenever the endpoint is hit. Yes you are right in this example I am not setting a variable property name, the ideal solution would. – Craig Sep 13 '15 at 04:31
  • "so they can continue" - with what, some kind of workflow on the frontend or with sending more requests to the server ? – Bryan P Sep 14 '15 at 21:01
  • So they can continue on the frontend without the browser swirling saying the page is still loading, I'm using async so the user can still use the page but the browser acts like it's still loading the page, which I do not want. – Craig Sep 14 '15 at 22:46
  • Hi, Do you see this article http://ramblings.mcpher.com/Home/excelquirks/parallel I think that is almost what you want to achieve ? – St3ph Sep 15 '15 at 07:09
  • It would be helpful to see the code that is calling the Apps Script, is it served by Apps Script or hosted elsewhere? – Cameron Roberts Sep 16 '15 at 16:58

5 Answers

3

If the only requirement is to stop the window's loading indicator on the frontend, as clarified in the comments, then consider wrapping the script-tag insertion inside setTimeout():

function createScriptTag() {

  setTimeout( function() {  // throw task in JS queue to prevent window load
    var scriptTag = document.createElement('script');
    scriptTag.src = "SCRIPT_URL?prefix=myCallback"; // include parameters if needed
    document.body.appendChild(scriptTag);
  }, 0);

}

createScriptTag();
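If the request also needs to carry data, the script URL can be built with properly encoded parameters first. A small sketch (SCRIPT_URL, myCallback, and the response parameter are placeholders, matching the snippet above):

```javascript
// Sketch: build the JSONP script URL with properly encoded parameters.
// baseUrl, callbackName, and the params keys are all placeholders.
function buildJsonpUrl(baseUrl, callbackName, params) {
  const pairs = ['prefix=' + encodeURIComponent(callbackName)];
  for (const key of Object.keys(params)) {
    pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  return baseUrl + '?' + pairs.join('&');
}

console.log(buildJsonpUrl('SCRIPT_URL', 'myCallback', { response: 'hello world' }));
// → SCRIPT_URL?prefix=myCallback&response=hello%20world
```

The returned string is what you would assign to `scriptTag.src` inside `createScriptTag()`.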
Bryan P
  • 5,031
  • 3
  • 30
  • 44
2

If the page your user is viewing is served from Google Apps Script, you can use google.script.run to make asynchronous calls to your server side applications.

See: https://developers.google.com/apps-script/guides/html/reference/run

Cameron Roberts
  • 7,127
  • 1
  • 20
  • 32
1

I would create properties acting as 'tasks' and have one trigger running every minute that just checks what needs to be done.

On trigger:

var tasks = SCRIPT_PROP.getProperties(); // snapshot of all pending tasks

for (var key in tasks) {
  handleTasks(tasks[key]);
  SCRIPT_PROP.deleteProperty(key);
}
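The drain loop can be exercised in plain JavaScript. Here a plain object stands in for PropertiesService, and `handleTask` is a hypothetical worker; `getProperties()` returns a snapshot, which is why deleting keys while iterating the snapshot is safe:

```javascript
// Sketch of the drain loop: `props` stands in for PropertiesService.
const props = { task_1: '{"x":1}', task_2: '{"x":2}' };
const handled = [];

function handleTask(raw) {
  // Hypothetical worker: record the parsed payload
  handled.push(JSON.parse(raw).x);
}

const tasks = Object.assign({}, props);  // snapshot, like getProperties()
for (const key of Object.keys(tasks)) {
  handleTask(tasks[key]);
  delete props[key];                     // like deleteProperty(key)
}

console.log(handled);                    // [ 1, 2 ]
console.log(Object.keys(props).length);  // 0 — queue drained
```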
Riël
  • 1,251
  • 1
  • 16
  • 31
1

What you can do (if you don't mind the data being processed in batches) is to have just one time-based everyMinutes trigger, which calls a function to process all of your data. Of course, if you need the data processed somewhat fast (like 5-10 seconds after submission), then it's likely not going to work for you.

You also don't have to use PropertiesService; I guess it would be simpler to just use a global array: push new data into it, and when the triggered function runs, delete all processed entries from the array.

Of course it's not a perfect solution, but unfortunately it's about as good as you can get without using Google's sandboxed HtmlService output. Pages served that way have access to the google.script.run API, which would solve your problem. But as far as I can see, you're running a standalone API-like script, so that won't work for you.
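The batch idea, sketched in plain JavaScript (note that in Apps Script each execution starts with a fresh global scope, so a real implementation would need persistent storage rather than a bare array, which matches what the comments below observed):

```javascript
// Sketch: push submissions into a queue, then let the minute trigger
// drain whatever has accumulated. `toUpperCase` stands in for whatever
// per-entry processing is actually needed.
const queue = [];

function submit(entry) {
  queue.push(entry);
}

function processBatch() {
  const batch = queue.splice(0, queue.length); // take everything, empty the queue
  return batch.map(function (e) { return e.toUpperCase(); });
}

submit('a');
submit('b');
console.log(processBatch()); // [ 'A', 'B' ]
console.log(queue.length);   // 0
```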

Vyacheslav Pukhanov
  • 1,788
  • 1
  • 18
  • 29
  • Ideally it would be processed sooner. I tried a global array and it was always undefined when the function was triggered, not sure if GAS is different to Javascript for this. I was wondering if I could use UrlFetch and send the GET info as POST to the same script and use doPost() to process it right away. Not sure if I can get the script to finish before the UrlFetch completes though, will have to experiment. – Craig Sep 13 '15 at 04:40
  • @Craig actually yesterday, thinking about your question, I've thought about using UrlFetch for it. The downside is that it's still "blocking", so your script will wait until it gets a response from UrlFetch, so you would encounter the same problem once again. TL;DR UrlFetch won't help you in this situation – Vyacheslav Pukhanov Sep 13 '15 at 07:29
1

What you need is a message queuing system. There is one built into the Google Cloud platform called Pub/Sub. The setup is not trivial, but not too difficult either; in fact, 90% of it can be done in the API console interface. The last bit can be handled with a Pub/Sub library found on my GitHub:
https://github.com/Spencer-Easton/Apps-Script-PubSubApp-Library

The basic outline follows:

1) User submission -> Form Collection adds response to Pub/Sub responseQueue -> Form Collection responds to the form submit

2) Pub/Sub responseQueue -> Form Processor performs business logic -> Form Processor adds response to processedQueue or audit log.

Note: Steps 1 and 2 run asynchronously from each other.

Form Collection Script
1) Add pubsub library: Mk1rOXBN8cJD6nl0qc9x5ukMLm9v2IJHf
2) Add GSApp library: MJ5317VIFJyKpi9HCkXOfS0MLm9v2IJHf
3) Open the script's Developers Console project.
--a) Add the Pub/Sub API.
--b) Add a service account under credentials. Download the key as JSON.
--c) Open Big Data -> Pub/Sub in the nav. menu.
--d) Create the topic: responseQueue
--e) Leave this open, as we will set permissions for this topic from the Form Processor script later.
4) Copy the contents of the JSON key into the script's properties, saved under the key jsonKey.
5) Add this snippet of code:

function getTokenService(){
  var jsonKey = JSON.parse(PropertiesService.getScriptProperties().getProperty("jsonKey"));  
  var privateKey = jsonKey.private_key;
  var serviceAccountEmail = jsonKey.client_email; 
  var sa = GSApp.init(privateKey, ['https://www.googleapis.com/auth/pubsub'], serviceAccountEmail);
  sa.addUser(serviceAccountEmail)
  .requestToken();
  return sa.tokenService(serviceAccountEmail);
}

6) Below is an example doGet() function:

function doGet(e) {
  try {
    PubSubApp.setTokenService(getTokenService());
    // Don't forget to set this to your script's project ID
    var pub = PubSubApp.PublishingApp('api-project-YourAPIProjectID');
    var message = pub.newMessage();
    message.data = Utilities.base64Encode(e.parameter.response);
    pub.getTopic('responseQueue').publish(message);
    return ContentService
      .createTextOutput(e.parameter.callback + '(console.log("processing"))')
      .setMimeType(ContentService.MimeType.JAVASCRIPT);
  } catch (err) {
    throw new Error(err);
  }
}

Form Processor Script
1) Add this example snippet of code:

function doPost(e) {
  var postBody = JSON.parse(e.postData.getDataAsString());
  var messageData = Utilities.newBlob(Utilities.base64Decode(postBody.message.data)).getDataAsString();
  // The spreadsheet is used as an audit log. Add your own spreadsheet ID here.
  var ss = SpreadsheetApp.openById(SpreadSheetId).getSheetByName("Log");
  ss.appendRow([new Date(), messageData, JSON.stringify(postBody, undefined, 2)]);
  return 200;
}
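The payload round trip can be checked outside Apps Script; here Buffer plays the role of Utilities.base64Encode/base64Decode (a plain-Node sketch, not the Apps Script API):

```javascript
// Round trip of the Pub/Sub message payload in plain Node. Apps Script
// would use Utilities.base64Encode / Utilities.base64Decode instead.
function encodeMessageData(text) {
  return Buffer.from(text, 'utf8').toString('base64');
}

function decodeMessageData(b64) {
  return Buffer.from(b64, 'base64').toString('utf8');
}

const data = encodeMessageData('user response');
console.log(decodeMessageData(data)); // → user response
```

This is the same transformation doGet applies before publishing and doPost reverses when the push subscription delivers the message.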

2) Publish Script as a Web App
3) Deploy script to the Chrome Store. (You can leave it in draft mode)
4) Get the script's deployed URL and save it for later. It will look like:

https://script.google.com/a/macros/{DOMAIN}/s/AKfycbyEeHW32Pa...5gLHa/exec

5) Open the script's Developers Console project.
--a) Add the Pub/Sub API.
--b) Add a service account under credentials. Download the key as JSON.
6) Go back to the Form Collection Dev Console.
--a) Add the URL from step 4 to APIs & Auth -> Push -> Add Domain.
--b) In the Form Collection Dev Console Pub/Sub settings, add the service account email from step 5 to the responseQueue permissions as a subscriber.
--c) Click Add Subscriber on the responseQueue topic. Give the subscription a memorable name like formProcessorOne. Select Push, then enter the same URL you got in step 4.
--d) Click More Options and enter a timeout limit under acknowledgement deadline. This is the amount of time you expect your Form Processor script to finish in. If the deadline passes with no acknowledgement, the message goes back into the queue.

Finally
When messages get posted to your Form Collector, the response parameter is added to the responseQueue. The responseQueue pushes the message to the Form Processor script. Note: depending on the amount of traffic and the length of the business logic in your Form Processor, you may get the too-many-concurrent-scripts error. Never fear: because you set a timeout, the message goes back into the queue and will be attempted again.

So this may be overkill for your project, depending on your scaling needs. The other answers in this thread are also reasonable. If you are expecting low volume, using the Properties Service as a working queue would work, with proper locking.

Also, using HtmlService to host your HTML gives you access to google.script.run, which is similar to jQuery's $.get() but maintains access and authentication with your script transparently.

Spencer Easton
  • 5,642
  • 1
  • 16
  • 25