
I'm using Puma and Nginx to run my Rubinius app.

I would like to separate my URL requests.

The first one would be for API requests, the second one for other requests.

I know Puma already uses threading, but I want to be sure that web requests won't block a thread in a way that stalls my API requests. I assume that if a thread is busy, Puma will create another one, but I want to be sure that one is always available for API requests.

My main point here is to "reserve" a thread for the API requests, which are what my users need the most.


2 Answers


Since Puma handles each request in a separate thread, the only bottleneck here is database access by those threads. Beyond that, you cannot guarantee that some threads are 'better' than others.

One possible solution worth noting is to handle this with nginx. Let's say your app serves content at http://some_host.com and the API is available at http://some_host.com/api. You can configure nginx to handle requests for http://some_host.com and http://some_host.com/api separately. In this case you'll need two separate Puma instances: one for the base app and one for API requests. What I mean is that when a request comes to http://some_host.com it is handled by Puma A, and when it comes to http://some_host.com/api/..., by Puma B.

Just remember one thing: you can handle requests with separate instances, but you still have only one database, unless you're caching content. Which raises another question: do you cache your content? If not, wouldn't it be a better idea to start with caching first?
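For reference, each Puma instance can be given its own config file so the two can be tuned and restarted independently. A minimal sketch for the API instance; the filename, port, and thread counts are all illustrative, not taken from the question:

```ruby
# puma_api.rb -- hypothetical config for the API-only Puma instance
port 3001                  # illustrative port; the main app would use a different one
threads 4, 16              # min and max threads for this instance
environment "production"
```

You would then start it with `puma -C puma_api.rb`, and point the corresponding nginx location at its port.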

blelump
  • Thank you for your fast answer! Before switching to Puma I used Passenger with two instances (as you described for Puma), but in my mind it's not the best way to do it; I thought it was a bit dirty. I had exactly what you said about `some_host.com` and `some_host.com/api`. The API requests do streaming and I cannot cache that data because it's always new content. I would like to be sure that a web request won't block the streaming flow (during the upload of a big file, for example). – brcebn Oct 28 '14 at 16:57
  • I see, interesting case. What do you stream? – blelump Oct 28 '14 at 17:06
  • Music, videos, and soon a lot of new content. – brcebn Oct 28 '14 at 19:54

Why not split the main application and the API? It's easy to serve two distinct applications with Nginx:

  location / {
    proxy_set_header  X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header  X-Forwarded-Proto $scheme;
    proxy_set_header  Host $http_host;
    proxy_redirect    off;
    proxy_pass        http://puma1;
  }


  location /api/ {
    proxy_set_header  X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header  X-Forwarded-Proto $scheme;
    proxy_set_header  Host $http_host;
    proxy_redirect    off;
    proxy_pass        http://puma2/;
  }

Note the trailing slash in the second location's proxy_pass: it rewrites the request so the '/api' prefix is stripped before it is passed upstream.
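For completeness, `puma1` and `puma2` in the `proxy_pass` directives are upstream names that must be defined in the `http` context of the nginx config. A sketch assuming the two Puma instances listen on local ports 3000 and 3001 (both ports are illustrative; Unix sockets work equally well):

```nginx
# Hypothetical upstream definitions; adjust ports or sockets
# to wherever your two Puma instances actually listen.
upstream puma1 {
  server 127.0.0.1:3000;   # main application instance
}

upstream puma2 {
  server 127.0.0.1:3001;   # API instance
}
```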

Anatoly