Have you heard of the robots.txt file? If you are familiar with WordPress, you have probably come across it. It has a crucial influence on your site’s SEO performance: a well-optimized robots.txt file can improve your search engine ranking, while a wrongly configured one can badly hurt it. In this guide, you will learn how to create, set up, and improve your WordPress website’s robots.txt file and optimize it for SEO.
WordPress automatically generates a robots.txt file for your website, but you still need to take a few steps to optimize it properly.
There are many other factors in SEO, but this file is one you cannot ignore. Since editing it involves a few lines of code, most website owners hesitate to make changes to it. However, you don’t need to worry. Today’s article covers why it matters and how to optimize the WordPress robots.txt file for better SEO. Before moving further, let’s cover some fundamentals.
What Is a Robots.txt File?
Robots.txt is a text file webmasters create to instruct web robots (typically search engine crawlers) how to crawl pages on their websites. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.
The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as “follow” or “nofollow”). Keep reading to learn how to create, set up, and improve your WordPress website’s robots.txt file and optimize it for SEO performance and ranking.
Robots.txt Usage in WordPress
As mentioned, every WordPress site has a default robots.txt file in its root directory. You can check yours by going to http://yourdomain.com/robots.txt. For example, you can see our robots.txt file here: https://visualmodo.com/robots.txt
If you don’t have a robots.txt file, you’ll have to create one. It’s straightforward: create a plain text file on your computer, save it as robots.txt, and upload it to your root directory via an FTP client or the cPanel File Manager.
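If you are not sure what to put in it yet, a minimal robots.txt that allows all crawlers to access everything looks like this (the empty Disallow line means nothing is blocked):

User-agent: *
Disallow: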
Now let’s see how to edit your robots.txt file. You can edit it through an FTP client or the cPanel File Manager, but that is time-consuming and a bit cumbersome.
WordPress Robots Plugin
The best way to edit the robots.txt file is with a plugin. There are several WordPress robots.txt plugins out there; I prefer Yoast SEO, which I consider the best SEO plugin for WordPress (I’ve already shared how to set it up). It lets you modify the robots.txt file right from your WordPress admin area.
However, if you don’t want to use Yoast, you can use other plugins like WP Robots Txt. Once you’ve installed and activated the Yoast SEO plugin, go to WordPress Admin Panel > SEO > Tools.
Then click on “File editor” and, after that, on “Create robots.txt file”. This opens the robots.txt file editor, where you can configure your robots.txt file. Before editing the file, you need to understand its three main directives:
- User-agent – Names the search engine bot the rules apply to, such as Googlebot or Bingbot. Use an asterisk (*) to target all search engine bots.
- Disallow – Tells search engines not to crawl certain parts of your site.
- Allow – Tells search engines which parts of your site they may crawl and index.
Here’s a sample robots.txt file:
User-agent: *
Disallow: /wp-admin/
Allow: /
The first line applies the rules to all search engine bots. The second line tells them not to crawl the /wp-admin/ section. The third line instructs them to crawl and index the rest of the website.
WordPress Robots.txt Settings for Better SEO
A simple misconfiguration in the robots.txt file can get your site completely deindexed from search engines. For example, if you use the directive “Disallow: /”, search engines will stop crawling your site and it will drop out of their indexes. So you need to be careful while configuring it.
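To make the danger concrete, this is the configuration to avoid on a live site, because it tells every crawler to stay away from the entire website:

User-agent: *
Disallow: /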
Another important thing is optimizing the robots.txt file for SEO. Before going over the best practices, I’d like to warn you about some bad practices:
- Don’t use the WordPress robots.txt file to hide low-quality content. The better practice is to use noindex and nofollow meta tags, which you can add with the Yoast SEO plugin (see the example after this list).
- Don’t use the robots.txt file to stop search engines from indexing your categories, tags, archives, author pages, and so on. Instead, add noindex and nofollow meta tags to those pages with the Yoast SEO plugin.
- Don’t use the robots.txt file to handle duplicate content. There are better ways to deal with it, such as canonical URLs.
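For reference, a noindex, nofollow meta robots tag looks like the snippet below when placed in a page’s <head>. Yoast SEO generates it for you, so you rarely need to write it by hand:

<meta name="robots" content="noindex, nofollow">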
Make the Robots.txt File SEO-Friendly
First, decide which parts of your site you don’t want search engine bots to crawl. I prefer disallowing /wp-admin/, /wp-content/plugins/, /readme.html, and /trackback/. Second, an “Allow: /” directive is not strictly necessary, since bots will crawl your site anyway.
But you can use it to target a particular bot. Adding your sitemaps to the robots.txt file is also a good practice. In addition, read this article about WordPress sitemaps.
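If you do want to address a specific bot, here is a short sketch that keeps the general rule but explicitly allows one crawler to access everything (Bingbot is just an illustrative user-agent; a bot follows the most specific group that matches its name):

User-agent: *
Disallow: /wp-admin/

User-agent: Bingbot
Allow: /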
Here’s an example of an ideal robots.txt file for WordPress:
User-agent: *
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /wp-content/plugins/
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/

Sitemap: https://visualmodo.com/post-sitemap.xml
Sitemap: https://visualmodo.com/page-sitemap.xml
Google Webmaster Tools WordPress Robots Test
After updating your robots.txt file, test it to check whether any important content is blocked by the change.
You can use Google Search Console to check whether there are any errors or warnings in your robots.txt file. Log in to Google Search Console, select your site, then go to Crawl > robots.txt Tester and click the “Submit” button. A box will pop up; click “Submit” again.
Finally, reload the page and check whether the file has been updated; it can take some time for Google to fetch the new robots.txt. If it hasn’t updated yet, you can paste your robots.txt code into the tester to see any errors or warnings it reports.
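You can also fetch the live file directly from the command line to confirm exactly what crawlers currently see (replace the domain with your own):

curl https://yourdomain.com/robots.txt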
If you notice any errors or warnings, fix them by editing the robots.txt file. I hope this guide helped you learn how to create, set up, and improve your WordPress website’s robots.txt file and optimize it for SEO performance and ranking.