
I'm using Nginx and pointing a couple of old domain names to a new site.

The first block in this config works fine for how I need old.domain to behave when redirecting to new.domain.

In the second block, I'm trying to forward any request for oldmediaserver.domain except /robots.txt to the homepage of new.domain. In its current state, every request redirects, including /robots.txt - and I can't work out why.

(The reason for this is that I had something indexed by Google from the old domain, and I'm trying to remove it from search results via Webmaster Tools - that might not work, but it's not what I'm asking for help with here!)

# Old site to new site config

server {
    listen 80;
    listen [::]:80;

    server_name old.domain www.old.domain;

    rewrite     ^ $scheme://www.new.domain$request_uri permanent;
}
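(As an aside: the nginx docs suggest `return` over `rewrite` for simple whole-site redirects, since it avoids regex processing. An equivalent sketch of the first block - untested, but using only the same names as above:)

```nginx
server {
    listen 80;
    listen [::]:80;

    server_name old.domain www.old.domain;

    # 301 matches rewrite's "permanent" flag; $request_uri
    # preserves the original path and query string
    return 301 $scheme://www.new.domain$request_uri;
}
```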

# Media server Redirect and Robots directive

server {
    listen 80;
    listen [::]:80;

    server_name oldmediaserver.domain www.oldmediaserver.domain;

    location / {
        rewrite     / $scheme://www.new.domain/ permanent;
    }


    location /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
    }

    rewrite     ^ $scheme://www.new.domain/ permanent;

}


server {
    listen 80 default_server;
    listen [::]:80 default_server;

    root /var/www/website-name/html;

    # Add index.php to the list if you are using PHP
    index index.php index.html index.htm index.nginx-debian.html;

    server_name www.new.domain;

    location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        try_files $uri $uri/ /index.php?$args;
    }

    # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;

        # With php5-fpm:
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }

    # include a file for any 301 redirects
    include includes/website-name-redirects;

    location /members/ {
        try_files $uri $uri/ /index.php?$args;
        auth_basic "Members Login";
        auth_basic_user_file /var/www/website-name/html/.htpasswd;

        location ~ \.php$ {
            include snippets/fastcgi-php.conf;

            # With php5-fpm:
            fastcgi_pass unix:/var/run/php5-fpm.sock;
        }
    }

    # !!! IMPORTANT !!! We need to hide the password file from prying eyes.
    # This denies access to any hidden file (names beginning with a period).
    location ~ /\. { deny all; }

}

Thanks for any light you can shed!

Ben
  • In the second config block, try to remove the second `rewrite...` statement. – gxx Nov 08 '15 at 20:09
  • Thanks - part of the way there now - `www.oldmediaserver.domain/robots.txt` now returns correctly, and `www.oldmediaserver.domain/foo` forwards to `new.domain`'s homepage - however `oldmediaserver.domain/robots.txt` without _www_ still forwards rather than returning the robots directive? – Ben Nov 08 '15 at 20:21
  • Please turn on [debug logging](http://nginx.org/en/docs/debugging_log.html) and check the access log for the request, which isn't working as expected: Which server and location directive is choosen? – gxx Nov 08 '15 at 20:34
  • change to `location = /robots.txt` – Drifter104 Nov 08 '15 at 21:23
  • @gf_ thanks - I found it really hard to get through the debug information (I switched it on globally and maybe should have only done it at a server level?). Couldn't find anything there, but thanks for your assistance. @Drifter104 I didn't have much luck with using `=` for exact match, but you set me on the path to the answer (below) - so thanks! – Ben Nov 09 '15 at 21:16

1 Answer


Thanks to gf_ and Drifter104 for the comments. Drifter104's comment about matching the location got me looking into the different matching patterns and eventually landing on the config below.

# Media server Redirect and Robots directive

server {
    listen 80;
    listen [::]:80;

    server_name oldmediaserver.domain www.oldmediaserver.domain;

    location ^~ / {
        rewrite     ^ $scheme://www.new.domain/ permanent;
    }

    location ^~ /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
    }
}

I'm still not sure I understand fully why this works and the other didn't, so if anyone can shed any more light that would be great!
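My best (hedged) understanding so far: a `rewrite` sitting directly at server level, like the stray one in my original config, is evaluated *before* nginx selects a `location`, so it redirected every request, /robots.txt included. The `^~` modifier then makes nginx commit to the longest matching prefix and skip any regex locations. A few people suggested an exact match instead, which is checked before all prefix and regex locations - an untested sketch of that variant, with `default_type` added as a precaution so the robots body is served as plain text rather than whatever the server default is:

```nginx
server {
    listen 80;
    listen [::]:80;

    server_name oldmediaserver.domain www.oldmediaserver.domain;

    # Exact match: wins over every prefix and regex location
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Everything else goes to the new site's homepage
    location / {
        return 301 $scheme://www.new.domain/;
    }
}
```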

Ben