To help search engines index your blog correctly, you need a properly configured robots.txt file for WordPress. Let's look at how to create one and what to put in it.
What does robots.txt do?
Search engines need it to index your site correctly. The file's content "tells" a search robot which pages to show in search results and which to hide, letting you manage how your content appears in search output.
Where is robots.txt?
This plain text file lives in the root directory of the site and can be accessed at the following address.
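For a blog hosted at example.com (a placeholder domain; substitute your own), that address is:

```
https://example.com/robots.txt
```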
What if you can't find the file?
If the file's content is displayed at that address but the file itself is not on the server, it is being generated virtually. This makes no difference to search engines; the main thing is that it is accessible.
What does it consist of?
It consists of four main directives:
- User-agent – specifies which search robots the rules apply to.
- Disallow – blocks access to a path.
- Allow – explicitly permits access to a path.
- Sitemap – the full URL of the XML sitemap.
Correct robots.txt for WordPress
There are many possible variants; the right set of instructions differs from site to site.
Here is an example of a solid robots.txt that accounts for the typical sections of a WordPress site. Let's take a brief look at its directives.
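A representative version might look like this (the example.com domain and sitemap URL are placeholders; adjust the paths to match your installation):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/cache/
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /feed/
Disallow: /author/
Disallow: /search/
Disallow: /?s=
Disallow: */attachment/*
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
Allow: /wp-*.js
Allow: /wp-*.css
Sitemap: https://example.com/sitemap.xml
```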
The first line indicates that the rules apply to all search robots (crawlers).
The Disallow directives keep service directories and files, cached pages, the login and registration sections, RSS feeds, author pages, search results, and attachment pages out of search output.
The Allow directives let scripts, styles, uploaded files, themes, and plugins into the index.
The last line gives the full address of the XML sitemap.
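Before deploying the file, you can sanity-check the rules with Python's standard urllib.robotparser module. This is a minimal sketch (the example.com domain and paths are placeholders); note that Python's parser applies rules in file order, so the more specific Allow line is listed before the broader Disallow:

```python
from urllib import robotparser

# Rules are parsed directly from a string: no network access needed.
rules = """
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The service area is blocked; the AJAX endpoint and regular posts are not.
print(rp.can_fetch("*", "https://example.com/wp-admin/"))               # False
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php")) # True
print(rp.can_fetch("*", "https://example.com/2024/05/hello-world/"))    # True
```

Keep in mind that major crawlers such as Googlebot use longest-match precedence rather than file order, so it is good practice to order rules so that both interpretations agree, as above.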
How to create robots.txt for your site
Let's consider several methods.
You can create the file manually, for example in Notepad (on a local server) or via an FTP client (on hosting).
You can also do this with WordPress plugins. Let's go over the best of them.
Yoast SEO
This powerful SEO plugin for WordPress will also solve the problem:
- Go to SEO > Tools.
- Click File Editor.
- If the file does not exist yet, click Create robots.txt file. If it does, the editor will open so you can make changes.
- Click Save changes to robots.txt.
All in One SEO Pack
This solution also "knows" how to work with robots.txt. To set it up:
- Open All in One SEO > Modules.
- Select the Robots.txt module and click Activate.
- Go to All in One SEO > Robots.txt.
- Add the directives in the fields.