
I'm in the process of doing my homework for a move from Apache without caching to nginx with caching, possibly via Varnish...

After reading various blogs, articles, Server Fault questions, etc., I understand that Varnish cannot handle SSL, and that Varnish might be better than nginx at actually caching dynamic content. But I'm a little stuck on understanding how nginx caching works AND how nginx + Varnish can play together when SSL is required.

How would the following be implemented, with nginx+varnish OR just nginx with caching?

  1. some URLs are driven by a custom PHP engine: i.e. example.com/this-page is served by example.com/index.php?p=this-page

  2. some URLs are driven by WordPress: i.e. example.com/blog/this-article is handled by WordPress via example.com/blog/index.php?p=this-article

  3. SSL should be forced everywhere: i.e. http://example.com/* redirects to https://example.com/*

  4. www should redirect to the top domain: i.e. http://www.example.com/* redirects to https://example.com/*

All of the above should serve the cached version if it exists. (If I understand correctly, caching is time-based, so when I update a page I would need to issue a PURGE if I go the Varnish route, or manually delete the cached files if I go the nginx route.)
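From what I've read, purging a single URL from Varnish would look something like this (assuming the VCL has been set up to allow PURGE requests; the port and path below are just placeholders):

```
# send an HTTP PURGE for one cached URL to Varnish (the VCL must permit it)
curl -X PURGE http://127.0.0.1:6081/this-page
```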

davidkomer

1 Answer


If you want to use nginx and Varnish, I would suggest something like this:

  • nginx as the frontend and SSL terminator, configured with Varnish as its backend. This layer also normalizes (rewrites) hostnames, e.g. from www.example.org to example.org (see the sketch after this list).
  • Varnish for caching content from its backend, nginx.
  • nginx with virtual hosts running on e.g. port 8080, holding all the "application logic" like WordPress rewrites, custom PHP-engine rewrites and so on.
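A minimal sketch of the frontend nginx, assuming Varnish listens on its default port 6081 and that the certificate paths are placeholders for your own:

```
# frontend nginx: SSL termination, forced HTTPS and www -> apex normalization
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;   # 3. force SSL everywhere
}

server {
    listen 443 ssl;
    server_name www.example.com;
    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;
    return 301 https://example.com$request_uri;   # 4. www -> top domain
}

server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/nginx/ssl/example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/example.com.key;

    location / {
        proxy_pass http://127.0.0.1:6081;         # plain HTTP to Varnish
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```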

So basically you would have a stack like this:

nginx -> Varnish -> nginx -> php-fpm
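The Varnish piece can start out as little more than a backend definition pointing at the backend nginx (Varnish 3-era syntax; host and port are assumptions):

```
# /etc/varnish/default.vcl
backend default {
    .host = "127.0.0.1";
    .port = "8080";    # backend nginx with the application logic
}
```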

The four points you mention would then be solved like this:

  1. Backend nginx with rewrites, handing PHP off to php-fpm via fastcgi_pass (see the sketch below).
  2. Backend nginx with the WordPress-related rewrites.
  3. Frontend nginx with simple virtual hosts for the redirections.
  4. Same as 3.
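A sketch of the backend nginx vhost covering points 1 and 2, assuming php-fpm listens on a local socket; paths, socket name and rewrites are placeholders to adapt:

```
# backend nginx on 127.0.0.1:8080 with the application logic
server {
    listen 127.0.0.1:8080;
    server_name example.com;
    root /var/www/example.com;
    index index.php;

    # 1. custom PHP engine: /this-page -> /index.php?p=this-page
    location / {
        try_files $uri $uri/ @engine;
    }
    location @engine {
        rewrite ^/(.*)$ /index.php?p=$1 last;
    }

    # 2. WordPress under /blog (WordPress resolves the permalink itself
    #    from REQUEST_URI, so no explicit ?p= mapping is needed here)
    location /blog/ {
        try_files $uri $uri/ /blog/index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }
}
```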
pkhamre
  • Thanks - so nginx frontend would be on public 80 and 443 (80 for forcing ssl), Varnish would be on localhost 80, and backend nginx would be on 8080? If that's right- will SSL certification break going from 80(nginx)->443(nginx)->80(varnish)->8080(nginx)? – davidkomer Sep 04 '12 at 12:01
  • Yeah, you are quite correct, but I would not run Varnish on port 80 on localhost even if the port is available. It would be confusing, so I suggest you just use the default Varnish port instead: 6081. – pkhamre Sep 04 '12 at 12:03
  • And about the SSL certificate breaking: nope, not if it is set up correctly. It will be HTTPS between the client and the server, but from the nginx frontend to Varnish and from Varnish to the nginx backend it will be plain HTTP. Don't use HTTPS in the proxy chain unless you have to. – pkhamre Sep 04 '12 at 12:05
  • Thanks! Would you recommend this as opposed to keeping it all in nginx? I don't need ESI – davidkomer Sep 04 '12 at 13:34
  • Yes, definitely. But that is because I have lots of experience with Varnish for caching. I am not sure about nginx and caching, but I know Varnish is able to do pretty advanced caching. And it's very good at it! – pkhamre Sep 04 '12 at 13:54
  • But it's slower: http://todsul.com/nginx-varnish – VBart Sep 13 '12 at 00:09