How to edit robots.txt and sitemap.xml

Search engines typically try to load the robots.txt file before they start crawling the server, that is, before they index the site into their database. The contents of this file can, for example, prevent a search engine from seeing parts of the site.
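
For reference, a minimal robots.txt might look like this (the paths here are hypothetical examples):

User-agent: *
Disallow: /private/
Disallow: /drafts/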

A related file is sitemap.xml, from which search engines can fetch a list of all the pages on the site that should be taken into account.
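
For reference, a minimal sitemap.xml lists one page per <url> entry (example.com is a placeholder for your own domain):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
 <url>
  <loc>https://example.com/</loc>
  <lastmod>2024-01-01</lastmod>
 </url>
 <url>
  <loc>https://example.com/about/</loc>
 </url>
</urlset>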

Don't create the robots.txt or sitemap.xml files yourself – let WordPress create them!

Many people connect to the server over SSH/SFTP and create these files out of old habit. With WordPress this is the wrong approach: you should let WordPress generate the files on the fly. When Google or Bing requests the site's robots.txt file, WordPress serves a response to the search engine even though no robots.txt file exists on the disk.
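
You can verify this yourself by requesting the file from the command line (example.com stands in for your own site; the exact rules WordPress returns vary by version and settings):

$ curl https://example.com/robots.txt
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php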

If you want to tailor the robots.txt file, you can do so by adding a function to the theme's functions.php file and hooking it to the do_robotstxt action.

Example:

function example_disallow_directory() {
    // Output extra rules; WordPress appends its default rules after this.
    echo "User-agent: *" . PHP_EOL;
    echo "Disallow: /forbidden/directory/" . PHP_EOL;
}
// Run when WordPress generates the virtual robots.txt.
add_action( 'do_robotstxt', 'example_disallow_directory' );
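
With the snippet above in place, a request to robots.txt would return something like this (the default rules WordPress appends after your output vary by version):

User-agent: *
Disallow: /forbidden/directory/
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php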


The same goes for the sitemap.xml file. WordPress core doesn't generate it by itself; the right way is to install an SEO plugin that creates it. Seravo recommends the SEO Framework plugin, because it is newer and lighter than its better-known competitor Yoast SEO.

