Scenario is as follows:

A call to a specified URL including the Id of a known SearchDefinition should create a new Search record in a db and return the new Search.Id.

Before returning the Id, I need to spawn a new process / start async execution of a PHP file which takes in the new Search.Id and does the searching.
The UI then polls a 3rd PHP script to get the status of the search (the 2nd script keeps updating the Search record in the db).
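
For reference, this is roughly what I have in mind for script 1. It's only a sketch: the connection details, the `searches` table and its columns are placeholders, not my real schema.

```php
<?php
// create_search.php (script 1) — sketch only; table/column names are made up.
$definitionId = isset($_GET['definitionId']) ? (int) $_GET['definitionId'] : 0;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Create the new Search record for the given SearchDefinition.
$stmt = $pdo->prepare(
    'INSERT INTO searches (definition_id, status) VALUES (?, ?)'
);
$stmt->execute(array($definitionId, 'pending'));
$searchId = (int) $pdo->lastInsertId();

// ...this is where script 2 needs to be kicked off asynchronously...

header('Content-Type: application/json');
echo json_encode(array('searchId' => $searchId));
```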
This leaves me with the problem of how to spawn the 2nd PHP script asynchronously.

I'll be running this on a 3rd-party server, so I have little control over permissions. As such, I'd prefer to avoid a cron job or similar polling for new Search records (and I don't really like polling if I can avoid it). I'm not a great fan of using a web server for work that isn't web-related, but it may be required to avoid permissions issues.
This seems to leave me with two options:

- The 1st script returns the Id and closes the connection, but keeps executing and actually does the search (i.e. stick script 2 at the end of script 1 but close the response at the point where it's appended).
- Launch a second PHP script in an asynchronous manner.
I'm not sure whether either of my rough attempts below is the right way to accomplish this. The first still feels nasty.
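
For option 1, the best I've come up with is flushing a complete response and then carrying on in the same request. This assumes output buffering is available, nothing has been sent yet, and `run_search()` is a hypothetical function holding script 2's logic:

```php
<?php
// Option 1 sketch: finish the response, then keep the same request running.
ignore_user_abort(true);   // don't die when the client disconnects
set_time_limit(0);         // the search may run for a while

ob_start();
echo json_encode(array('searchId' => $searchId));  // $searchId from the insert above
header('Content-Type: application/json');
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();

// Under PHP-FPM this ends the response more reliably than flush() alone.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// The client now has its Id; do the long-running search (script 2's work).
run_search($searchId);     // hypothetical function containing script 2's logic
```

As I understand it, the `Content-Length` / `Connection: close` headers are what let the browser stop waiting, and `fastcgi_finish_request()` only exists under PHP-FPM, hence the guard.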
If it's necessary to use cURL or similar to fake a web call, I'll do it, but I was hoping for some kind of convenient multi-threading approach where I simply spawn a new thread, point it at the appropriate function, and have permissions inherited from the caller (i.e. the web server user).
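
For option 2, the two approaches I can see are backgrounding the CLI interpreter (if `exec()` isn't disabled on the host) or faking the web call with a fire-and-forget socket instead of cURL, so script 2 runs as the web server user. Both of these are sketches with made-up paths/URLs:

```php
<?php
// (a) Background the CLI interpreter; '&' returns control immediately.
//     Only works if exec() isn't disabled and the php binary is on PATH.
$cmd = sprintf(
    'php %s %d > /dev/null 2>&1 &',
    escapeshellarg('/path/to/run_search.php'),  // hypothetical path to script 2
    $searchId
);
exec($cmd);

// (b) Fire-and-forget HTTP request: write the request, don't read the response.
//     Script 2 is then executed by the web server with its usual permissions.
$fp = fsockopen('localhost', 80, $errno, $errstr, 1);
if ($fp !== false) {
    $request  = "GET /run_search.php?searchId={$searchId} HTTP/1.1\r\n";
    $request .= "Host: localhost\r\n";
    $request .= "Connection: Close\r\n\r\n";
    fwrite($fp, $request);
    fclose($fp);
}
```

With (b) I assume script 2 would need to call `ignore_user_abort(true)` straight away, since the connection is dropped before it finishes.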