
Ok so I want to add this

User-agent: *
Disallow: /

to the robots.txt in all the environments other than production... any idea on the best way to do this? Should I remove it from the public folder and create a route/view instead?

I am using Rails 3.0.14, prior to the asset pipeline... any suggestions?
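
Something like this is what I had in mind for the routes/views approach (just a sketch; RobotsController is a name I made up, and public/robots.txt would have to be deleted so the request actually reaches the router):

# config/routes.rb
match '/robots.txt' => 'robots#show'

# app/controllers/robots_controller.rb
class RobotsController < ApplicationController
  def show
    if Rails.env.production?
      # allow everything in production
      render :text => '', :content_type => 'text/plain'
    else
      # block all crawlers everywhere else
      render :text => "User-agent: *\nDisallow: /", :content_type => 'text/plain'
    end
  end
end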

Matt Elhotiby

2 Answers


Capistrano task for uploading a blocking robots.txt

I wrote this up again today, essentially the same approach as Sergio's answer, but sharing the robots-specific result might save someone time :)

namespace :deploy do
  desc "Uploads a robots.txt that mandates the site as off-limits to crawlers"
  task :block_robots, :roles => :app do
    content = [
      '# This is a staging site. Do not index.',
      'User-agent: *',
      'Disallow: /'
    ].join($/)

    logger.info "Uploading blocking robots.txt"
    put content, "#{current_path}/public/robots.txt"
  end
end

Then trigger it from your staging recipe with something like

after "deploy:update_code", "deploy:block_robots"
captainpete

Here's some real working code from my project (it's an nginx config, not robots.txt, but the idea should be clear).

task :nginx_config do
  conf = <<-CONF
    server {
      listen 80;
      client_max_body_size 2M;
      server_name #{domain_name};

      -- snip --
    }
  CONF

  put conf, "/etc/nginx/sites-available/#{application}_#{rails_env}"
end

So, basically, you build the content of your file in a string and then put it to the desired path. Capistrano will upload the content over SFTP.
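
As a side note, put also accepts an options hash; if I recall correctly, Capistrano 2 lets you pass :mode to set the uploaded file's permissions:

put conf, "/etc/nginx/sites-available/#{application}_#{rails_env}", :mode => 0644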

Sergio Tulentsev