-2

I am working on an eCommerce site, and it creates URLs whenever a user searches in the search bar or filters products.

For example, if a user searches for "apple", the website creates a URL like example.com/s=apple? or something like that, and the filter feature does the same: every filter creates another URL, and Google is crawling them.

I need a suggestion on whether I should block the crawler from these URLs or leave them alone. Can anyone tell me the robots.txt rule to block bots from crawling those user-generated URLs?

  • I’m voting to close this question because, from the seo tag: Note: General SEO questions are off-topic. Only programming-related SEO questions are acceptable on Stack Overflow. Non-programming SEO questions should be asked on Webmasters.SE at https://webmasters.stackexchange.com. – Rob Sep 08 '22 at 12:38
  • This has already been asked and answered on Webmasters: [Prevent indexing of site search results](https://webmasters.stackexchange.com/questions/107962/prevent-indexing-of-site-search-results) – Stephen Ostermiller Sep 10 '22 at 09:33

1 Answer

0

You should block them. Leaving these URLs crawlable will result in duplicate pages being indexed. Try one of the following rules and be sure to run a live test in Google Search Console: Disallow: /s= or Disallow: /s (note that the path in a Disallow rule must start with /).

Test both of them and keep whichever matches only the search and filter URLs.
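For illustration, here is a minimal robots.txt sketch, assuming the search URLs start with /s= as in the question's example and that filter URLs use a query parameter named filter (that parameter name is an assumption; adjust it to the site's actual URLs):

```
# Hypothetical robots.txt sketch for blocking internal search and filter URLs
User-agent: *

# Block internal search result pages, e.g. example.com/s=apple
Disallow: /s=

# Block filter pages; "filter" is an assumed query parameter name
Disallow: /*?filter=
```

After deploying it at example.com/robots.txt, use the URL Inspection live test in Google Search Console to confirm that a sample search URL is reported as blocked by robots.txt while normal product and category pages remain crawlable.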
