How to Optimize Your WordPress Robots.txt (Advanced Guide)

Many of you may have heard of the robots.txt file, but do you know where it lives and what it looks like? Most importantly, do you know how to optimize your WordPress robots.txt file to boost SEO?

This article answers those questions to give you a clear picture of the WordPress robots.txt file. From there, you can use the file to increase your visibility on SERPs or hide sensitive content from the public eye.

We’ll start with the robots.txt file’s location and a sample file, highlight ways to edit it, and then wrap up with advanced tips on WordPress robots.txt optimization.

Let’s dive in!

Where Is the WordPress robots.txt File Located?

The robots.txt file resides in the root folder of your website. Its main duty is to guide search engine bots on how to crawl and index your site. In particular, it denies search engine access to certain files and folders.

Many WordPress site owners use the robots.txt file to discourage search indexing. That sounds counterintuitive, right? After all, SEO, or winning the top spots on SERPs, plays a critical role in online business success.

Still, if your site is under development or contains low-quality or personal content, robots.txt comes in handy for blocking crawling and indexing of that content. From there, you can optimize your WordPress robots.txt file to prioritize your top landing pages, such as sales pages, product pages, or sponsored posts.

On the same note, you can also optimize your crawl quota. Crawl quota, or crawl budget, refers to the number of pages on your WordPress site that Google’s bots crawl in a given time frame. Steering that budget toward your important pages helps get them indexed and ranked on the first page of search results more quickly.

However, be careful when optimizing the robots.txt file, since it can harm your SEO if done poorly.

WordPress Robots.txt Example

You can view the robots.txt file of any website by adding /robots.txt to the end of its domain name. A robots.txt file will look something like this:

[Image: WordPress robots.txt file]
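In plain text, a typical WordPress robots.txt along those lines reads something like this (with example.com standing in for your own domain):

User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/plugins/

Sitemap: https://example.com/sitemap.xml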

The user-agent in a robots.txt file names the search engine bot the rules apply to. If the user-agent is marked with an asterisk (*), the rules apply to all search engines.

In robots.txt files, allow and disallow tell the bots which pages and content they can and cannot crawl. As you can see, we have allowed search engines to crawl admin-ajax.php in the WordPress admin folder and disallowed access to the plugins and admin folders themselves.

A sitemap is an XML file listing all the pages on your website along with their details; the Sitemap line tells bots where to find it.

How to Edit Robots.txt in WordPress?

The easiest way to edit robots.txt in WordPress is with a plugin. Several robust SEO plugins excel at helping you edit the robots.txt file: All in One SEO, Yoast SEO, and Rank Math, just to name a few.

Apart from that, you can also turn to plugins developed solely to manage your WordPress robots.txt file, such as Virtual Robots.txt and WordPress Robots.txt optimization.

[Image: optimize your WordPress robots.txt file with a plugin]

Alternatively, you can edit the file manually via an FTP client. First, connect to your WordPress hosting account using an FTP client. After that, locate the robots.txt file in the root directory of your website.

[Image: edit robots.txt in WordPress via FTP]

If you don’t see the robots.txt file there, chances are your site doesn’t have one. Don’t freak out, just create a new one: right-click and choose “Create new file,” then download it to your desktop. Robots.txt is a plain text file, so you can download and edit it like any other file using a text editor such as Notepad or WordPad.
[Image: create a robots.txt file in WordPress]

After making your changes, upload the edited robots.txt file back to the root folder and you’re done.
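If you’re creating the file from scratch and aren’t sure what to put in it, a safe starting point is to mirror what WordPress serves by default when no physical robots.txt exists; recent WordPress versions also add a Sitemap line pointing to the built-in wp-sitemap.xml (replace example.com with your own domain):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml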

Test Your Robots.txt File

You need to make sure your robots.txt file still works properly after editing it. A single mistake can get your website excluded from search result pages.

Among the wide range of robots.txt testing tools out there, we recommend Google Search Console, which lets you test your robots.txt file for free.

Make sure you have already added and verified your website in Google Search Console so that the tool can confirm you own the property. If not, check out our guide on how to submit your site to search engines.

Then you’re ready to use the Google Search Console robots.txt testing tool. Once you select a property from the dropdown menu, the tool immediately goes through the file and notifies you of any errors and warnings.

[Image: Google Search Console robots testing tool]

More than that, the tool allows you to enter a specific page on your site and select a user agent to check whether that page is blocked from or open to crawling.
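For instance, assuming your file disallows /wp-content/plugins/, entering a hypothetical URL such as /wp-content/plugins/example-plugin/script.js with Googlebot selected should come back as blocked, with the matching Disallow rule pointed out, while a regular post URL should come back as allowed.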

You can edit the file directly in the tool and re-run the test. However, the actual file on your site remains unchanged, so you’ll need to copy your edits into the real robots.txt file and save it there.

[Image: robots.txt tester]

How to Optimize Your WordPress Robots.txt File for SEO

It’s highly recommended to selectively disallow sensitive paths such as /wp-admin/, /wp-content/plugins/, /trackback/, and /readme.html, while leaving everything else allowed.
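Put together, a ruleset along those lines might look like the sketch below; treat it as a starting point and adjust the paths to your own setup:

User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /trackback/
Disallow: /readme.html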

What’s more, you can optimize your WordPress robots.txt file by adding your sitemap to it. While WordPress provides its own sitemap, SEO plugins like Yoast SEO or All in One SEO can build a custom one for you. They even let you create separate sitemaps for posts and pages.

At that point, adding these sitemaps to your robots.txt file helps bots find and crawl them more quickly and easily.
[Image: add sitemap to WordPress robots.txt]
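For example, with Yoast-style separate sitemaps (the exact file names depend on your plugin, and example.com stands in for your own domain), the lines you add might look like this:

Sitemap: https://example.com/post-sitemap.xml
Sitemap: https://example.com/page-sitemap.xml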

On top of that, despite the appeal of optimization, it’s strongly advised not to over-edit the robots.txt file. Changing too much can misinform bots and end up blocking them from crawling parts of your site that you actually want indexed.

So, less is more. You just need to focus on your sitemap and get your top pages discovered first.

Boost Your Online Visibility Now!

Robots.txt is an important file enabling you to strategically optimize your crawl quota and hide private content from search indexing.

This article has shown you what a robots.txt file is, where to find it, and how to edit it. We have also covered how to optimize your WordPress robots.txt file for SEO. The key takeaway: the fewer edits to robots.txt, the better. And don’t forget to add a custom sitemap to the file to boost the visibility of your key pages.

What are you waiting for? Boost your online visibility with the robots.txt file now!