How to Edit the robots.txt and sitemap.xml

Search engines try to load the robots.txt file before they start to crawl the site. This file's contents can, for example, prevent a search engine from indexing parts of the site.
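As an illustration, a minimal robots.txt that blocks all crawlers from a hypothetical /private/ directory could look like this (the directory name is just an example):

```
User-agent: *
Disallow: /private/
```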

A similar file is sitemap.xml, from which search engines can get a list of all the pages of the site that should be taken into account.
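A sitemap.xml follows the Sitemaps protocol; a minimal example with one hypothetical URL might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/sample-page/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```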

Don't create the robots.txt or sitemap.xml file yourself. Let WordPress create them.


Many people connect to the server over SSH/SFTP and create these files out of old habit. In WordPress's case this is wrong. You should let WordPress create the files on the fly: when Google or Bing requests the site's robots.txt file, WordPress answers the search engine even though no robots.txt file exists on disk.
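You can verify that the dynamically generated file is being served by fetching it with curl (replace example.com with your own site's domain):

```shell
# Fetch the robots.txt that WordPress generates on the fly;
# example.com is a placeholder for your own domain
curl -s https://example.com/robots.txt
```

If WordPress is answering the request, you will see the generated rules even though no robots.txt file exists on the server's disk.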

If you want to tailor the robots.txt file, you can do this by adding a function to your theme's functions.php file that is hooked to the do_robotstxt action.

Example:

function example_disallow_directory() {
    // Append rules to the robots.txt output that WordPress generates on the fly
    echo "User-agent: *" . PHP_EOL;
    echo "Disallow: /kielletty/hakemisto/" . PHP_EOL;
}
add_action( 'do_robotstxt', 'example_disallow_directory' );
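If you need to modify the output WordPress has already generated rather than just append to it, core also offers the robots_txt filter. A minimal sketch adding the same rule:

```php
// Alternative approach: modify the complete robots.txt output string.
// The filter receives the generated output and whether the site is public.
add_filter( 'robots_txt', function ( $output, $public ) {
    $output .= "Disallow: /kielletty/hakemisto/" . PHP_EOL;
    return $output;
}, 10, 2 );
```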


The same goes for the sitemap.xml file. Since WordPress 5.5, core generates a basic sitemap at /wp-sitemap.xml, but for real control the right way is to install a WordPress SEO plugin that creates it. Seravo recommends The SEO Framework plugin, because it is newer and lighter than its well-known competitor Yoast SEO.

More information about the subject can be found in the presentation below:

