In this article I will talk about the Robots.txt file, how to edit the Robots.txt file, and how you can easily optimize the WordPress Robots.txt file. I will also explain how you can create a Robots.txt file in case your website's robots.txt is not available for some reason.
When you are optimizing your website for SEO, you will want to cover most of the things that improve your site's rankings. The Robots.txt file is one of those aspects you should consider optimizing for your site's SEO.
Whether your site runs on WordPress, another platform, or is a static website, an optimized Robots.txt file is an important factor for better SEO. An optimized Robots.txt allows Search Engine bots to crawl and index the important content of your site, while disallowing (blocking) them from the parts of your site that are not intended to be indexed.
What is a Robots.txt file?
The Robots.txt file is a simple text file in the root directory of your WordPress setup. This file contains a user-specified set of instructions for Search Engine bots. These instructions help Search Engines (Google, Bing, etc.) understand where they are allowed to go while visiting your site. Robots.txt also tells Search Engines which places are disallowed for their visit.
I will also talk about certain things that you do not want Search Engines to crawl and index. In other words, you should not allow Search Engine bots to access vulnerable components of your website.
How to Create Robots.txt file?
(If your site already has a robots.txt file, you can jump to the next sections on how to edit it and optimize it.) Not sure whether your site has one? Continue reading:
When you set up WordPress using the most popular 1-click install method, a Robots.txt file is usually created in the root directory of WordPress. For static websites too, robots.txt is stored in the root folder of the website.
You can check whether your site has a robots.txt file by visiting the following URL:

https://yourdomain.com/robots.txt

Replace yourdomain.com with your domain name and open the above URL in a browser. You should see the content of the robots.txt file.
For example, at https://technumero.com/robots.txt you will find the Robots.txt file of Technumero.com.
If robots.txt is not available on your site, you will get a 404 (page not found) error.
If, for some reason, your WordPress site does not have a robots.txt file, there is no need to worry, as you can easily create one.
Follow the steps below to create robots.txt file in WordPress:
- Create a text file in any text editor (like Notepad), paste a basic set of instructions (syntax) into it, and save the file as robots.txt on your computer.
- Go to the root directory of your WordPress website using an FTP client or via Hosting dashboard > File Manager. (Here, the root directory is the folder on your web hosting where all the WordPress core files are stored.)
- Upload the robots.txt file from your computer to the root directory using the FTP client or File Manager. You are done; you have successfully created a Robots.txt file.
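If you are unsure what to paste in step 1, a minimal starter file could look like the sketch below. The paths are common WordPress defaults, and yourdomain.com is a placeholder you must replace with your own domain:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/uploads/

Sitemap: https://yourdomain.com/sitemap.xml
```

This is only a safe baseline; the optimization section below discusses which additional paths you may want to allow or disallow.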
How to Edit Robots.txt file in WordPress?
Before editing the robots.txt file, I recommend taking a backup of the file to avoid any hiccups.
You can edit the WordPress robots.txt file using any of the following methods:
- Edit robots.txt locally on computer with text editor like notepad and upload it to your site’s root directory.
- Use a specific plugin to edit Robots.txt file.
- Use Yoast SEO plugin.
- Directly edit it on Webhosting using cPanel File Manager.
- To edit locally, open the robots.txt file on your computer just like any other text file, make your changes, save it, and upload it back to the root directory. That's it.
- However, it is advisable to avoid using a plugin for small things if you can. If you are a plugin person, you can install a dedicated plugin to edit robots.txt.
- Yes, you can use the Yoast SEO plugin to edit robots.txt. In the Yoast SEO settings, go to Tools > File Editor > robots.txt file. From there you can edit it.
- Go to the root directory of your WordPress install using the Hosting Dashboard (cPanel) > File Manager. Right-click the robots.txt file and click Edit; you will be taken to the file edit screen. Make your changes to robots.txt there and save when you are done.
Now your site has a robots.txt file and you know how to edit it. But what exactly should you write in it? The next section is all about the content of the robots.txt file.
How to Optimize WordPress Robots.txt file
In this section, I will show you how to use the Robots.txt file in WordPress for optimum SEO.
In the syntax of the robots.txt file, there are a few crawler directives that specify the action of a particular instruction. These crawler directives tell Search Engine bots what they should do with different parts of your site.
List of some Crawler Directives used in Robots.txt:
User-agent: specifies which bots (Search Engine bots, ad bots, etc.) the rules that follow apply to.
Allow: permits bots to crawl and index a path.
Disallow: blocks bots from crawling and indexing a path.
Sitemap: specifies the sitemap URL(s) of your site.
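To see how these directives interact, here is a short sketch using Python's standard-library urllib.robotparser. The rule set and URLs are made up purely for illustration:

```python
from urllib.robotparser import RobotFileParser

# A tiny rule set combining the directives described above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/uploads/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The Disallow-ed path is off-limits to all bots...
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))      # False
# ...while the explicitly Allow-ed path may be crawled.
print(parser.can_fetch("*", "https://example.com/wp-content/uploads/a.png"))  # True
```

This mirrors how compliant crawlers read your file: a URL is matched against the rules in the group that applies to the bot's user agent.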
There are a few things you should keep in mind while editing the Robots.txt file:
- You should specify the sitemap URLs of your site.
- Be careful while choosing a folder to Disallow, as it may affect how your site appears in Search Engines.
- You should Disallow a link-cloaking folder like /out or /recommends in your root directory, if you have one.
- You should Disallow the readme.html file in the root directory of your WordPress setup.
- As a widely accepted practice, you may also Disallow the /wp-content/plugins/ folder to prevent crawling of unwanted plugin files.
- You should not use the Robots.txt file to prevent crawling of your low-quality content. By content, I am particularly referring to the main content of your site, i.e., articles, blog posts, images, etc. If you really want to noindex some of your content, do it with the appropriate method; for instance, you can use the noindex option provided in the Yoast SEO plugin.
- You must not insert any comma, colon, or other symbol that is not part of the instruction syntax.
- Do not add extra spaces between instructions unless it is done for a reason.
WordPress Robots.txt example
The following is the robots.txt file of Technumero.com at the time of writing this article. You can use these instructions in the robots.txt file of your own WordPress site, with adequate changes such as the sitemap links.
```
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php
Disallow: /wp-content/plugins/
Disallow: /feed/
Disallow: */feed/
Allow: /wp-content/uploads/

User-agent: Mediapartners-Google
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Googlebot
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://technumero.com/post-sitemap.xml
Sitemap: https://technumero.com/page-sitemap.xml
```
Search Engine bots should only be allowed to crawl and index files that you can share publicly. The Robots.txt file above disallows Search Engine bots from accessing core files of the website, which are not required for the public.
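You can sanity-check a rule set like this before uploading it, using Python's standard-library urllib.robotparser. In this sketch, example.com stands in for your own domain, and only a few representative lines from the file above are included:

```python
from urllib.robotparser import RobotFileParser

# A few representative lines from the example robots.txt above.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /readme.html
Allow: /wp-content/uploads/

User-agent: Mediapartners-Google
Allow: /

Sitemap: https://example.com/post-sitemap.xml
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Regular bots are kept out of /wp-admin/ ...
print(parser.can_fetch("*", "https://example.com/wp-admin/"))
# ... but Mediapartners-Google has its own group allowing everything.
print(parser.can_fetch("Mediapartners-Google", "https://example.com/wp-admin/"))
# Sitemap URLs are picked up as well (site_maps() needs Python 3.8+).
print(parser.site_maps())
```

Checking a handful of important URLs this way can catch a stray Disallow before it blocks content you actually want indexed.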
Robots.txt @ Popular WordPress Blogs
How do they do it?
Like many other WordPress topics, popular blogs hold different opinions on the robots.txt file too. In one of Yoast's articles, they explain the use of a very simple robots.txt file. In another blog post, the WPBeginner team suggests a robots.txt file with Allow and Disallow crawling directives for different parts of your site. Other examples of robots.txt files from popular blogs: Amit Agrawal of Labnol.Org and Neil Patel of QuickSprout.Com.
I have shared the robots.txt file of Technumero.com above; you can use it on your WordPress site with the necessary changes. You can also look at the robots.txt files of some of these popular blogs and decide how you want to keep your site's robots.txt file.
Test if Robots.txt is Working Fine
Hey! Robots.txt, How are you doing?
You can test whether your site's Robots.txt file is working fine. You can also check whether robots.txt is blocking a specific URL of your site.
Open the Google Webmaster Tools dashboard for your site, then go to Crawl > robots.txt Tester. Clicking robots.txt Tester will take you to the tester screen.
There you can see the content of your robots.txt file, along with any errors or warnings it contains.
Right below that, there is an option to enter a specific URL of your site to check whether robots.txt is blocking it. Using this option, you can fetch a specific URL of your site as Googlebot, Googlebot-News, Googlebot-Image, Googlebot-Video, Googlebot-Mobile, Mediapartners-Google, or Adsbot-Google.
I hope this article helped you understand the Robots.txt file and optimize the WordPress Robots.txt file. If you have a question or just want to say hello, feel free to post it in the comments below.
Learn more about WordPress Optimization:
How to Enable Gzip Compression in WordPress
How to Optimize WordPress Database Easily