How to create the perfect robots.txt for WordPress

To help search engines index your blog correctly, you need a proper robots.txt file for WordPress. Let’s see how to create it and what to put in it.

What does robots.txt do?

It helps search engines index a site correctly. The file “tells” the search robot which pages may appear in search results and which should stay hidden, letting you manage what shows up in the results.

Where is robots.txt located?

It is a plain text file located in the root directory of the site, available at

https://site.ru/robots.txt

Can’t find the file?

If the file’s contents are displayed at that address but the file itself is not on the server, it is being generated virtually (for example, by a plugin). Search engines don’t care either way; what matters is that the file is accessible.
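Whether the file is physical or virtual, it must always be served from the root of the host. A quick sketch with Python’s standard library shows how to build the check URL (`site.ru` here is just the placeholder domain used throughout this article):

```python
from urllib.parse import urljoin

def robots_url(site: str) -> str:
    # robots.txt must be served from the root of the host,
    # no matter which page URL you start from
    return urljoin(site, "/robots.txt")

print(robots_url("https://site.ru/blog/some-post/"))  # https://site.ru/robots.txt

# To actually check availability you would make a network request,
# e.g. with urllib.request.urlopen(robots_url("https://site.ru/"))
# and confirm the response status is 200.
```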

What does it consist of?

There are four main directives:

  • User-agent – specifies which search robots the rules apply to.
  • Disallow – denies access to a path.
  • Allow – allows access to a path.
  • Sitemap – the full URL of the XML sitemap.

Also see: How to Remove WordPress Archives

Correct robots.txt for WordPress

There are many possible variants; the right set of instructions differs from site to site.

Here is an example of a correct robots.txt that covers the typical sections of a WordPress site. Let’s briefly analyze the directives.

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-content/cache
Disallow: /wp-json/
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /xmlrpc.php
Disallow: /license.txt
Disallow: /readme.html
Disallow: /trackback/
Disallow: /comments/feed/
Disallow: /*?replytocom
Disallow: */feed
Disallow: */rss
Disallow: /author/
Disallow: /?
Disallow: /*?
Disallow: /?s=
Disallow: *&s=
Disallow: /search
Disallow: *?attachment_id=
Allow: /*.css
Allow: /*.js
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
Sitemap: https://site.ru/sitemap_index.xml

The first line indicates that the rules apply to all search robots (crawlers).

The Disallow directives block indexing of service directories and files, cached pages, login and registration pages, RSS feeds, author archives, search results, and attachment pages.

The Allow directives keep scripts, styles, uploads, themes, and plugins crawlable.

The last line gives the address of the XML sitemap.
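You can sanity-check rules like these with Python’s standard-library `urllib.robotparser`. Note that, as far as I know, Python’s parser only does simple prefix matching and does not implement Google-style `*` and `$` path wildcards, so this sketch uses prefix-only rules taken from the example above:

```python
from urllib import robotparser

# Prefix-only subset of the rules above (Python's parser does not
# understand Google-style * and $ wildcards in paths)
rules = """\
User-agent: *
Disallow: /wp-admin
Disallow: /wp-content/cache
Allow: /wp-content/uploads/
Sitemap: https://site.ru/sitemap_index.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://site.ru/wp-admin/"))                 # False
print(rp.can_fetch("*", "https://site.ru/wp-content/uploads/a.png"))  # True
```

For wildcard rules such as `Disallow: /*?replytocom`, test against a tool that implements Google’s matching, for example the robots.txt report in Google Search Console.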

How to create robots.txt for a website

Let’s consider several methods.

Manually

You can create the file by hand, for example in Notepad (on a local server) or through an FTP client (on a hosting account).
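On a local install you can even generate the file with a short script before uploading it to the site root. This is only an illustrative sketch; the rules are a minimal subset of the example above and should be adapted to your site:

```python
from pathlib import Path

# Minimal illustrative rules -- adapt to your own site before uploading
rules = "\n".join([
    "User-agent: *",
    "Disallow: /wp-admin",
    "Allow: /wp-content/uploads/",
    "Sitemap: https://site.ru/sitemap_index.xml",
]) + "\n"

# Write the file next to the script; upload it to the site root afterwards
Path("robots.txt").write_text(rules, encoding="utf-8")
print(Path("robots.txt").read_text(encoding="utf-8").startswith("User-agent: *"))  # True
```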

You can also do this with WordPress plugins. Let’s look at one of the best.

Clearfy Pro

Clearfy Pro creates a virtual robots.txt file. To set it up:

  1. Go to the Clearfy Pro admin menu.
  2. On the SEO tab, enable the Generate correct robots.txt option.
  3. Fill in the contents of the file.
  4. Save your changes.

