I am working on an eCommerce site, and it creates URLs whenever a user searches in the search bar or filters products.
For example, if a user searches for "apple", the site generates a URL like example.com/?s=apple, and the filter feature creates similar parameterized URLs. Google is crawling all of them.
I need advice on whether I should block the crawler from these pages or leave them alone. Can anyone tell me the robots.txt rule to block bots from crawling these user-generated URLs?
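For what it's worth, here is a minimal robots.txt sketch that would block common search and filter URL patterns. It assumes the search parameter is `s` and the filter parameter is `filter` — you would need to swap in whatever query parameters your site actually generates:

```
# Block internal search result URLs (assumes the "s" query parameter)
User-agent: *
Disallow: /*?s=
Disallow: /*&s=

# Block filter URLs (assumes a "filter" query parameter)
Disallow: /*?filter=
Disallow: /*&filter=
```

Note that Disallow only stops crawling, not indexing of URLs Google already knows about; a `noindex` meta tag on those pages is the usual way to remove them from the index, and Google can only see that tag if the page is not blocked in robots.txt.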