I have an application that uses Express as the user-facing framework for my REST API, together with RabbitMQ for RPC-style calls to a clustered backend. I also use Q to promisify the workload in my routes.
One of these routes triggers functionality that crawls a URL specified in the route's parameters, performs GeoIP lookups, normalizes result formats, and so on. This can take several seconds, depending on the response times of the crawled URL's servers.
What I would like to achieve is that the user who POSTs a new URL to crawl gets immediate feedback on the request (status 200 = "Crawling request acknowledged") instead of having the request wait for the crawl to finish.
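Roughly, the pattern I'm after looks like this (stripped down to plain functions so it is independent of Express; `crawlUrl` and `sendResponse` are placeholders for my real crawl logic and for `res.status(...).json(...)`):

```javascript
// Sketch of the "acknowledge first, crawl later" pattern.
// crawlUrl() stands in for the real crawl + GeoIP lookup + normalization.
function crawlUrl(url) {
  return new Promise((resolve) =>
    setTimeout(() => resolve({ url, crawled: true }), 50));
}

// sendResponse is a stand-in for res.status(...).json(...) in Express.
function handleCrawlRequest(url, sendResponse) {
  // Respond immediately -- 202 Accepted arguably matches
  // "request acknowledged, work still pending" even better than 200.
  sendResponse(202, { message: 'Crawling request acknowledged' });

  // Kick off the crawl without awaiting it; failures are logged,
  // not reported to the (already answered) HTTP request.
  crawlUrl(url).catch((err) => console.error('crawl failed:', err));
}
```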
My ideas so far are:
- Sending the URL to a dedicated queue in RabbitMQ and having a separate worker process consume jobs from that queue
- Spawning something like a child process from inside the Express route
What would be the best solution here? Thanks for your valuable input.