To help search engines index your blog correctly, you need a properly configured robots.txt file for WordPress. Let’s look at how to create it and what to put in it.

What does robots.txt do?

You need it so that search engines index your site correctly. The file’s contents “tell” the search robot which pages to show in search results and which to hide, which lets you manage what appears in search output.
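
For illustration, a minimal robots.txt might look like this (the rules here are only an example):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

The first line addresses all robots, the second hides the admin area, and the third makes an exception for a file that themes and plugins request from the front end.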

Fill in robots.txt as early as the site development stage. Changes to it do not take effect immediately: it may take anywhere from a week to several months.


Where is robots.txt?

This plain text file sits in the root directory of the site. It is available at the following address:

https://site.ru/robots.txt

WordPress does not create a physical robots.txt file out of the box. You need to create one manually or use a tool that generates it automatically.

What if you can’t find the file?

If the file’s contents are displayed at the address above but the file itself is not on the server, it is being generated virtually. For search engines this makes no difference; the main thing is that the file is accessible.
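
It is WordPress itself that generates this virtual file, and its contents can be adjusted through the standard robots_txt filter. Here is a minimal sketch; the rule being appended is only an illustration:

<?php
// A sketch: extend the virtual robots.txt that WordPress serves
// when no physical file exists in the site root.
add_filter( 'robots_txt', function ( $output, $is_public ) {
    // $is_public reflects the site's "visible to search engines" setting.
    if ( $is_public ) {
        $output .= "Disallow: /wp-json/\n"; // example rule only
    }
    return $output;
}, 10, 2 );

Note that as soon as a physical robots.txt appears in the root directory, the server returns it directly and this filter no longer has any effect.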

What does it consist of?

It consists of four main directives:

  • User-agent – specifies which search robots the rules apply to.
  • Disallow – blocks access to pages or sections.
  • Allow – explicitly allows access.
  • Sitemap – the full URL of the XML sitemap.

Correct robots.txt for WordPress

There are many variants, and the right set of instructions differs from site to site.

Here is an example of a correct robots.txt that takes all sections of a WordPress site into account. Let’s take a brief look at the directives.

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-content/cache
Disallow: /wp-json/
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /xmlrpc.php
Disallow: /license.txt
Disallow: /readme.html
Disallow: /trackback/
Disallow: /comments/feed/
Disallow: /*?replytocom
Disallow: */feed
Disallow: */rss
Disallow: /author/
Disallow: /?
Disallow: /*?
Disallow: /?s=
Disallow: *&s=
Disallow: /search
Disallow: *?attachment_id=
Allow: /*.css
Allow: /*.js
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
Sitemap: https://site.ru/sitemap_index.xml

The first line indicates that the rules apply to all search robots (crawlers).

The Disallow directives keep service directories and files, cached pages, the login and registration pages, RSS feeds, author pages, search results, and attachments out of the search output.

The Allow directives permit scripts, styles, uploaded files, themes, and plugins to be indexed.

The last line is the address of the XML sitemap.


How to create robots.txt for your site

Let’s look at a few methods.

Manual

You can do this, for example, in Notepad (on a local server) or through an FTP client (on a hosting account).
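
In short, the manual route looks like this (the path below assumes a typical hosting layout):

  1. Create a plain text file named robots.txt in any text editor.
  2. Paste in your directives (see the example above).
  3. Upload it to the site root, e.g. public_html/robots.txt, so that it is reachable at https://site.ru/robots.txt.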

You can also do this with WordPress plugins. Let’s look at the best of them.

Yoast SEO

This powerful SEO plugin for WordPress can also handle the task.

  1. Go to SEO > Tools.
  2. Click File Editor.
  3. If the file does not yet exist in the root directory, click Create robots.txt file. If it does, the editor will open so you can make changes.
  4. Click Save changes to robots.txt.

All in One SEO Pack

This solution also “knows” how to work with robots.txt. To use it:

  1. Open All in One SEO > Modules.
  2. Select the robots.txt module and click Activate.
  3. Go to All in One SEO > Robots.txt.
  4. Add your directives in the fields provided.
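
Whichever method you choose, check the result afterward: the file should be served at https://site.ru/robots.txt. A quick way to verify from the command line (replace site.ru with your own domain):

curl https://site.ru/robots.txt

Google Search Console also has a robots.txt report where you can check how Googlebot reads the file.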