Most of the questions I see are about hiding a site from being indexed by search engines. I'm attempting the opposite.
For the robots.txt file, I've put the following:
# robots.txt
User-agent: *
Allow: /
# End robots.txt file
To me, this means that search engines are allowed to crawl the entire site. However, when I test it, the search result still shows "A description for this result is not available because of this site's robots.txt", but when I click the link, the file above is exactly what's being served.
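As a sanity check on the rules themselves (separate from whatever the search engines have cached), Python's standard-library robots.txt parser can confirm what the file permits. This is a minimal sketch; `example.com` is just a placeholder domain, and the rule lines mirror the file above:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules locally (no network fetch involved),
# using the same directives as the file in question.
rules = [
    "User-agent: *",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# can_fetch() returns True when the given user agent may crawl the URL.
print(rp.can_fetch("*", "http://example.com/"))                   # → True
print(rp.can_fetch("Googlebot", "http://example.com/any/page"))   # → True
```

If this prints True, the directives are fine and the stale snippet is just the crawlers lagging behind.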
I'm guessing it just takes a while for Google and Bing to catch up? Or am I doing something wrong?
If it's because they haven't picked up the changes yet (they were made yesterday afternoon), does anyone have a rough estimate of when the changes will be reflected?