
What is the best possible and most advanced and up-to-date way to add robots.txt to a node application?

Can this be handled better at the web server level?

P.S: I'm using Mongo as DB, and Nginx as a web server.

BillNathan

1 Answer


Use a middleware function. That way you can serve different robots.txt content for different environments, such as production and development.

app.use('/robots.txt', function (req, res) {
    res.type('text/plain');
    res.send('User-agent: *\nDisallow: /');
});
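The handler above always disallows everything; to actually vary the response per environment, the middleware can pick its content from `NODE_ENV`. A minimal sketch (the environment names and robots strings here are assumptions, adjust to your setup):

```javascript
// Environment-specific robots.txt content (example values, not canonical).
const robots = {
  production: 'User-agent: *\nAllow: /',
  development: 'User-agent: *\nDisallow: /'
};

// Mount in an existing Express app with:
//   app.get('/robots.txt', robotsHandler);
function robotsHandler(req, res) {
  // Express's app.get('env') defaults to NODE_ENV, falling back to 'development'.
  const env = process.env.NODE_ENV || 'development';
  res.type('text/plain');
  res.send(robots[env] || robots.development);
}
```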

Alternatively, keep a separate robots.txt in a folder per environment and serve it statically:

if (app.get('env') === 'production') {
  app.use(express.static(__dirname + '/production')); // serves the robots.txt from the production folder
} else {
  app.use(express.static(__dirname + '/development')); // serves the robots.txt from the development folder
}

Related post: What is the smartest way to handle robots.txt in Express?
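On the web-server part of the question: since Nginx already sits in front of the Node app, it can serve robots.txt directly from disk and never hit Node at all, which is slightly faster and keeps the app code cleaner. A sketch of the relevant server block (the file path is an assumption, substitute your own):

```nginx
# Serve robots.txt straight from disk, bypassing the Node upstream.
# /var/www/myapp/public/robots.txt is an assumed path.
location = /robots.txt {
    alias /var/www/myapp/public/robots.txt;
}
```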

Fatih Aktaş
  • Should I just create robots.txt and add it to the public/ directory? I've read that Express doesn't automatically serve static files from /public and that you need to configure it to do so, is that right? – BillNathan Jun 12 '19 at 22:50
  • Yes, look up short tutorials on creating a server with Express; they explain it well. Then learn how to serve static files. robots.txt is just another file that you can serve statically. – Fatih Aktaş Jun 12 '19 at 22:53