I am writing a plugin for WordPress with a function that should update the robots.txt file, or create it if it doesn't exist yet.
So far I have this function:
function roots_robots() {
    echo "Disallow: /cgi-bin\n";
    echo "Disallow: /wp-admin\n";
    echo "Disallow: /wp-includes\n";
    echo "Disallow: /wp-content/plugins\n";
    echo "Disallow: /plugins\n";
    echo "Disallow: /wp-content/cache\n";
    echo "Disallow: /wp-content/themes\n";
    echo "Disallow: /trackback\n";
    echo "Disallow: /feed\n";
    echo "Disallow: /comments\n";
    echo "Disallow: /category/*/*\n";
    echo "Disallow: */trackback\n";
    echo "Disallow: */feed\n";
    echo "Disallow: */comments\n";
    echo "Disallow: /*?*\n";
    echo "Disallow: /*?\n";
    echo "Allow: /wp-content/uploads\n";
    echo "Allow: /assets\n";
    echo "\n";
}
add_action( 'do_robots', 'roots_robots' );
The robots.txt file is not updating; did I forget anything? Also, is there a way to first check whether the file already exists and create it if it doesn't?
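For the check-and-create part, this is roughly what I have in mind, though it's an untested sketch: roots_create_robots_file is just a name I made up, and I'm assuming the site runs from the install root (so ABSPATH . 'robots.txt' is the right path) and that the directory is writable by PHP.

// Untested sketch: on plugin activation, write a physical robots.txt
// if one doesn't already exist, reusing the rules from roots_robots().
function roots_create_robots_file() {
    $path = ABSPATH . 'robots.txt';   // assumes WordPress lives at the site root
    if ( ! file_exists( $path ) ) {
        ob_start();
        roots_robots();               // capture the echoed rules
        $rules = "User-agent: *\n" . ob_get_clean();
        file_put_contents( $path, $rules );
    }
}
register_activation_hook( __FILE__, 'roots_create_robots_file' );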
I found something in the kb-robots plugin, but I'm not 100% sure how to fold it into my function:
function kb_robotstxt() {
    # This is to make it work for demos and testing. Without this, the plugin
    # would only act when robots.txt is in a valid place. With this, it acts
    # whenever robots.txt is appended to the blog URL (even if the blog is in
    # a subdirectory).
    $request = str_replace( get_bloginfo('url'), '', 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] );
    # Bail out unless this request is actually for robots.txt.
    if ( ( get_bloginfo('url') . '/robots.txt' != 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] )
        && ( '/robots.txt' != $_SERVER['REQUEST_URI'] )
        && ( 'robots.txt' != $_SERVER['REQUEST_URI'] ) )
        return;
    # The plugin keeps its robots.txt contents in an option.
    $robotstxt = get_option('kb_robotstxt');
    if ( ! $robotstxt )
        return;
    header('Content-Type: text/plain');
    print $robotstxt;
    die;
}
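My best guess at combining that with my rules looks like the sketch below, but again untested: roots_serve_robots is a name I made up, I'm assuming kb-robots hooks this on init, and my request check only covers a root install (unlike the fuller check above).

// Untested sketch: intercept the robots.txt request the way kb-robots
// does, but print my own rules instead of the stored option.
function roots_serve_robots() {
    if ( '/robots.txt' != $_SERVER['REQUEST_URI'] )
        return;
    header('Content-Type: text/plain');
    echo "User-agent: *\n";   // roots_robots() only outputs the Disallow/Allow lines
    roots_robots();
    die;
}
add_action( 'init', 'roots_serve_robots' );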
Thanks!