What is the best possible and most advanced and up-to-date way to add robots.txt to a node application?
Can this be handled better at the web server level?
P.S: I'm using Mongo as DB, and Nginx as a web server.
Use a middleware function. That way you can serve a different robots.txt for different environments, such as production and development.
app.use('/robots.txt', function (req, res) {
    // Respond with a plain-text robots.txt; this example blocks all crawlers
    res.type('text/plain');
    res.send('User-agent: *\nDisallow: /');
});
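For instance, the handler can branch on the environment. A minimal sketch, where the specific Allow/Disallow policies are assumptions you would replace with your own:
app.use('/robots.txt', function (req, res) {
    res.type('text/plain');
    if (app.get('env') === 'production') {
        // Assumed production policy: allow all crawlers
        res.send('User-agent: *\nAllow: /');
    } else {
        // Assumed non-production policy: keep crawlers out
        res.send('User-agent: *\nDisallow: /');
    }
});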
Alternatively, serve robots.txt as a static file from an environment-specific folder:
if (app.get('env') === 'production') {
    // Serve static files, including robots.txt, from the production folder
    app.use(express.static(__dirname + '/production'));
} else {
    // Serve static files, including robots.txt, from the development folder
    app.use(express.static(__dirname + '/development'));
}
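As for the web-server level: yes, since you are already running Nginx, it can serve robots.txt directly without the request ever reaching the Node process. A minimal sketch of an Nginx location block, where the file path is an assumption:
location = /robots.txt {
    # Option 1: serve a static file from disk (assumed path)
    alias /var/www/myapp/public/robots.txt;

    # Option 2 (instead of alias): return the body inline
    # default_type text/plain;
    # return 200 "User-agent: *\nDisallow: /\n";
}
This keeps the file out of your application code entirely and lets Nginx answer crawler requests cheaply, at the cost of managing per-environment content in your Nginx config rather than in Express.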
Related post: What is the smartest way to handle robots.txt in Express?