As we know, robots.txt helps us keep certain webpages or sections from being indexed by web crawlers/robots. But this method has some disadvantages:
1. web crawlers are not obliged to obey the robots.txt file;
2. you are exposing the folders you want to protect to everybody, because robots.txt itself is publicly readable.
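To see the exposure problem, consider that anyone can fetch your robots.txt and read exactly which paths you consider sensitive. A typical file looks something like this (the paths are illustrative):

```
User-agent: *
Disallow: /admin/
Disallow: /private-reports/
```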
Is there another way to block crawlers from the folders you want to protect? Keep in mind that those folders may still need to be accessible from a browser (like /admin), so simply denying all requests to them is not an option.
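One approach is to handle this at the HTTP level instead of listing the folder in robots.txt: refuse requests from self-identified crawlers, and send an `X-Robots-Tag: noindex` header on the protected paths so compliant search engines will not index them even if they get through. Below is a minimal sketch, assuming a Python/Flask application; the `/admin` prefix and the bot user-agent substrings are illustrative assumptions, not a definitive list:

```python
from flask import Flask, request, abort

app = Flask(__name__)

# Substrings commonly found in crawler User-Agent headers (illustrative).
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "crawler", "spider")

# The folder we want to hide from crawlers but keep browser-accessible.
PROTECTED_PREFIX = "/admin"

@app.before_request
def block_crawlers_from_protected_paths():
    # Refuse crawler requests to the protected folder outright,
    # while regular browsers can still reach it.
    if request.path.startswith(PROTECTED_PREFIX):
        user_agent = request.headers.get("User-Agent", "").lower()
        if any(sig in user_agent for sig in BOT_SIGNATURES):
            abort(403)

@app.after_request
def add_noindex_header(response):
    # Even if a crawler slips through the User-Agent check, the
    # X-Robots-Tag header tells compliant engines not to index the page,
    # without advertising the path in a public robots.txt.
    if request.path.startswith(PROTECTED_PREFIX):
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/admin")
def admin():
    return "Admin area"
```

Keep in mind that User-Agent checks only stop honest crawlers that identify themselves; a malicious bot can spoof a browser string, so real protection for something like /admin should still come from authentication.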