How to Optimize WordPress Robots.txt File for SEO

By Claudio Pires
Updated on August 4, 2022
Have you optimized your WordPress robots.txt file for SEO? If you haven’t, you are ignoring an essential aspect of SEO: the robots.txt file plays a significant role in your site’s search performance. You are lucky that WordPress automatically creates a robots.txt file for you, but having the file is only half the battle. You must make sure it is optimized to get the full benefit.

The robots.txt file tells search engine bots which pages to crawl and which to avoid. This post will show you how to edit and optimize the robots.txt file in WordPress.

What is a Robots.txt File?

Let’s start with the basics. The robots.txt file is a text file that tells search engine bots how to crawl and index a site. Whenever search engine bots visit your site, they read the robots.txt file and follow its instructions. Using this file, you can specify which parts of your site bots should crawl and which parts they should avoid. However, the absence of a robots.txt file will not stop search engine bots from crawling and indexing your site.

Editing & Understanding Robots.txt in WordPress

As mentioned earlier, every WordPress site has a default robots.txt file in the root directory. You can check yours by going to http://yourdomain.com/robots.txt. For example, you can review our robots.txt file here: https://visualmodo.com/robots.txt

If you don’t have a robots.txt file, you’ll have to create one. It’s very easy to do: just create a text file on your computer, save it as robots.txt, and upload it to your site’s root directory via an FTP client or the cPanel File Manager.
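If you prefer to script this step, here is a minimal Python sketch using the standard-library ftplib module to create the file and upload it to the web root. The host and login details are placeholders, not real values — substitute your own hosting credentials:

```python
# Sketch: create a basic robots.txt locally, then upload it to the site's
# web root over FTP. HOST, USER, and PASSWORD below are placeholders.
from ftplib import FTP

ROBOTS_TXT = """User-agent: *
Disallow: /wp-admin/
Allow: /
"""

def write_robots(path="robots.txt"):
    """Save the rules to a local file and return its path."""
    with open(path, "w") as f:
        f.write(ROBOTS_TXT)
    return path

def upload_robots(host, user, password, path="robots.txt"):
    """Upload the local robots.txt to the server's web root."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(path, "rb") as f:
            ftp.storbinary("STOR robots.txt", f)

if __name__ == "__main__":
    write_robots()
    # upload_robots("ftp.example.com", "user", "password")  # placeholder credentials
```

Most people will never need this — the plugin approach below is easier — but it shows there is nothing magical about the file itself.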

Editing the Files

Now let’s see how to edit your robots.txt file.

You can edit your robots.txt file using an FTP client or the cPanel File Manager, but it’s time-consuming and a bit difficult.

The best way to edit the robots.txt file is by using a plugin. There are several WordPress robots.txt plugins out there. I prefer Yoast SEO, one of the most popular SEO plugins for WordPress. I’ve already shared how to set up Yoast SEO.

Yoast SEO allows you to modify the robots.txt file from your WordPress admin area. However, if you don’t want to use the Yoast plugin, you can use other plugins like WP Robots Txt.

Once you’ve installed and activated the Yoast SEO plugin, go to WordPress Admin Panel > SEO > Tools.

Then click on “File editor”.

Then you need to click on “Create robots.txt file.”


Then you will get the Robots.txt file editor. You can configure your robots.txt file from here.

Before editing the file, you need to understand its directives. There are mainly three:

  • User-agent – Defines the name of the search engine bots like Googlebot or Bingbot. You can use an asterisk (*) to refer to all search engine bots.
  • Disallow – Instructs search engines not to crawl and index some parts of your site.
  • Allow – Instructs search engine bots which parts of your site they may crawl and index.

Here’s a sample of the Robots.txt file.

User-agent: *
Disallow: /wp-admin/
Allow: /

This robots.txt file applies to all search engine bots (the asterisk in the first line). The second line tells bots not to crawl anything under /wp-admin/, and the third line allows them to crawl and index the rest of the site.
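You can confirm how these rules behave with Python’s standard-library robots.txt parser. The paths below are illustrative examples, not pages from any real site:

```python
# Sketch: check the sample rules with Python's built-in robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /wp-admin/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Anything under /wp-admin/ is blocked; everything else is crawlable.
print(parser.can_fetch("*", "/wp-admin/settings.php"))  # False
print(parser.can_fetch("*", "/my-post/"))               # True
```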

Configuring & Optimizing Robots.txt File for SEO

A simple misconfiguration in the robots.txt file can keep your entire site out of search engines. For example, the directive “Disallow: /” blocks bots from crawling your whole site, which can eventually remove it from search results. So you need to be careful while configuring it.

Another important thing is optimizing the robots.txt file for SEO. Before going over the best practices of robots.txt SEO, I’d like to warn you about some bad practices.

  • Don’t use the robots.txt file to hide low-quality content. The best practice is to use the noindex and nofollow meta tags, which you can add with the Yoast SEO plugin.
  • Don’t use the robots.txt file to stop search engines from indexing your categories, tags, archives, author pages, etc. You can add noindex and nofollow meta tags to those pages with the Yoast SEO plugin.
  • Don’t use the robots.txt file to handle duplicate content. There are better ways, such as canonical tags.

Now let’s see how you can make the Robots.txt file SEO friendly.

  1. Firstly, determine which parts of your site you don’t want search engine bots to crawl. I prefer disallowing /wp-admin/, /wp-content/plugins/, /readme.html, and /trackback/.
  2. Secondly, adding an “Allow: /” directive to the robots.txt file is not crucial, as bots will crawl your site anyway, but you can use it to target a particular bot.
  3. Finally, adding your sitemaps to the robots.txt file is also a good practice. Read: How to create Sitemap

Here’s an example of an ideal Robots.txt file for WordPress.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://roadtoblogging.com/post-sitemap.xml
Sitemap: https://roadtoblogging.com/page-sitemap.xml

You can check our live robots.txt file here: https://visualmodo.com/robots.txt
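Before deploying a file like the one above, you can sanity-check it locally. The sketch below uses Python’s standard-library parser; note that it applies rules in file order, which can differ slightly from Google’s longest-match behaviour, so treat it as a quick check rather than an exact replica of Googlebot:

```python
# Sketch: sanity-check the example robots.txt above before deploying it.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://roadtoblogging.com/post-sitemap.xml
Sitemap: https://roadtoblogging.com/page-sitemap.xml
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "/wp-content/plugins/some-plugin/"))  # False: blocked
print(parser.can_fetch("*", "/wp-content/uploads/image.png"))     # True: crawlable
print(parser.site_maps())  # the two Sitemap URLs
```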

Testing Robots.txt File in Google Search Console

After updating your robots.txt file, you should test it to make sure the changes don’t block any content you want indexed.

You can use Google Search Console to check if there are any “Errors” or “Warnings” for your robots.txt file. Just log in to Google Search Console and select your site. Then go to Crawl > robots.txt Tester and click the “Submit” button.

A box will pop up. Just click the “Submit” button.

Then reload the page and check if the file has been updated. It might take some time for Google to update the robots.txt file.

If it hasn’t been updated, you can paste your robots.txt code into the box to check for errors or warnings; any issues will be shown there.

If you notice any errors or warnings in the robots.txt file, fix them by editing the file.

Final Thoughts

In conclusion, I hope this post helped you optimize your WordPress robots.txt file. If you are confused about anything, feel free to ask us in the comments.

To make your WordPress blog even more SEO friendly, you can read our post on How to Setup WordPress Yoast SEO Plugin.

Finally, if you found this post helpful, please share it on Facebook and Twitter.

Claudio Pires

Claudio Pires is the co-founder of Visualmodo, a renowned company in web development and design. With over 15 years of experience, Claudio has honed his skills in content creation, web development support, and senior-level web design. A trilingual expert fluent in English, Portuguese, and Spanish, he brings a global perspective to his work. Beyond his professional endeavors, Claudio is an active YouTuber, sharing his insights and expertise with a broader audience. Based in Brazil, Claudio continues to push the boundaries of web design and digital content, making him a pivotal figure in the industry.