
I want to create a robots.txt for my ASP.NET MVC 5 web site. I found this link, which describes how to achieve this task:

http://rehansaeed.com/dynamically-generating-robots-txt-using-asp-net-mvc/

In that link they create a separate controller and route rule to build the robots.txt, so I am not sure why I cannot just create the robots.txt file and add it to the root of my web site, as follows:

(screenshot: a robots.txt file placed in the web site root)

If I navigate to http://www.mywebsite.com/robots.txt, the text content is shown, without having to create a separate controller for this.

So my question is: is it valid to add the robots.txt directly to my web site root, without doing it inside a controller with a separate route rule, to keep things simpler?
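Either way, the file a crawler fetches is just plain text. For reference, a minimal static robots.txt that allows all crawling and advertises a sitemap could look like this (the host and sitemap URL here are illustrative):

```text
User-agent: *
Disallow:
Sitemap: http://www.mywebsite.com/sitemap.xml
```

An empty `Disallow:` means nothing is blocked; the `Sitemap:` line is optional but commonly included.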

  • Yes, go ahead and add `robots.txt` manually. Those links are building dynamic site maps (not the robots file), so you wouldn't need to update the sitemap XML manually. – Jasen Jan 17 '16 at 02:47
  • @Jasen sorry, it seems I provided a wrong URL; I have updated my question with the correct one. In that example they define a route rule as follows: `[Route("robots.txt", Name = "GetRobotsText"), OutputCache(Duration = 86400)]` – Jan 17 '16 at 03:01
  • 1
    I can't remember the last time I said _"I wish my robots file were real-time dynamically generated"_. But I guess it's cool. – Jasen Jan 17 '16 at 04:53

3 Answers


On my website (ASP.NET Core 3.1), I just used the simple action below:

    // Serves robots.txt as plain text, pointing crawlers at the sitemap.
    // Requires: using System.Text;
    [Route("/robots.txt")]
    public ContentResult RobotsTxt()
    {
        var sb = new StringBuilder();
        sb.AppendLine("User-agent: *")
            .AppendLine("Disallow:")
            .Append("Sitemap: ")
            .Append(this.Request.Scheme)
            .Append("://")
            .Append(this.Request.Host)
            .AppendLine("/sitemap.xml");

        return this.Content(sb.ToString(), "text/plain", Encoding.UTF8);
    }
    For a simple/static robots.txt I find it easier to return directly from endpoint routing, such as `endpoints.MapGet("/robots.txt", async context => await context.Response.WriteAsync("User-Agent: *\nAllow: /"));` – k3davis Mar 13 '21 at 14:55

If you need to dynamically generate the content, the controller approach makes sense, because it gives you that ability. Otherwise, I use a static robots.txt file in my applications with no problems.

– Brian Mains

Placing a robots.txt file in the MVC application root, the same place as Startup.cs (as shown in your screenshot), worked well for me.