I want to add a robots.txt to my Laravel project, but the robots.txt packages I found are not compatible with Laravel 9+. If you know of any tutorial or package for the latest version of Laravel, please share. Thanks.
- Why do you need a specific package for robots.txt? It's just a text file. Do you need to generate it dynamically? – apokryfos Feb 19 '23 at 09:50
- Wow, I didn't know that. I want Google to crawl my website, but not restricted pages like the admin area. – Leslie Joe Feb 19 '23 at 10:02
2 Answers
To add a robots.txt file for a Laravel 9+ application, follow these steps:
Create a new file named robots.txt in the public directory of your Laravel project. You can use any text editor to create this file.
Open the robots.txt file and add the following lines:
User-agent: *
Disallow: /admin
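If other areas should stay out of search results as well, the same file can be extended with additional Disallow lines. A sketch (the /account and /dashboard paths are hypothetical examples; substitute your own routes):

```
User-agent: *
Disallow: /admin
Disallow: /account
Disallow: /dashboard
```

Note that robots.txt only asks well-behaved crawlers not to fetch these URLs; it is not access control, so admin routes should still be protected by authentication.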

Madan Sapkota
- Thanks a lot, I'll do it right away. How do I stop Google from crawling client account pages? – Leslie Joe Feb 19 '23 at 10:03
Use the routes in your route files (routes/web.php and so on) as a reference. Work through each route and, for every path you know contains pages that shouldn't be crawled, add a Disallow: line for it in your robots.txt file.
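The approach above can be sketched as a small script: given the route paths you have decided to block (a hypothetical list here, which you would take from routes/web.php), it writes out a robots.txt. This is just an illustration, not a Laravel-specific tool:

```python
# Sketch: generate robots.txt from a list of paths that shouldn't be crawled.
# The paths below are hypothetical; take yours from your route files.
disallowed = ["/admin", "/account", "/dashboard"]

lines = ["User-agent: *"]
lines += [f"Disallow: {path}" for path in disallowed]
robots_txt = "\n".join(lines) + "\n"

# Write it into Laravel's public/ directory so it is served at /robots.txt.
with open("robots.txt", "w") as f:
    f.write(robots_txt)
```

Running it once (or as part of a deploy step) keeps the file in sync with the routes you have chosen to block.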

BasketCase