When should you optimize your robots.txt file?
And what is the correct way to do it?
I will cover everything in this article.
So, sit back and relax.
Let’s face it:
Whenever you talk about the SEO of a WordPress website, the robots.txt file plays a major role in it.
And it also affects your rankings in search engines.
A well-optimized robots.txt file helps search engines crawl and index your website efficiently, which can help it rank well sooner.
But if your robots.txt file is misconfigured, it can block important pages, making it difficult for your site to rank higher, or it may not rank at all.
Before starting, I strongly recommend you read this article first, so that you will be able to understand everything easily.
So, let's finish this quickly and easily:
Your robots.txt file must be well optimized, and it should not block crawlers from the parts of your site you want indexed.
There are also many common problems with indexing and no-indexing of content.
But don't worry, I will address those problems too.
SEO has hundreds of parts, but one of the most basic is the robots.txt file.
This small text file lives on your website and can seriously help with its optimization.
Many webmasters avoid editing the robots.txt file, but it's not that hard.
Anyone with basic knowledge can create or edit a robots.txt file.
And if you are new to this, you are in the right place; this post is perfect for you.
If your website doesn't contain a robots.txt file yet, you can learn here how to create one.
If your website has a robots.txt file but it is not optimized, it is of little use.
So follow this article to optimize your robots.txt file quickly and easily and give your website a boost.
What is the WordPress robots.txt file, and why should you use it?
The robots.txt file guides bots, directing them to the parts of your website they should crawl and the parts they should avoid.
When a bot arrives at your site to index it, it reads the robots.txt file first.
The bot then follows the file's directives when deciding which pages of your website to index or skip.
If you use WordPress, your robots.txt file will be in the root of your WordPress installation (accessible via cPanel).
For static websites, if you or your developer created a robots.txt file, you will find it in your root folder.
If you can't find one there, simply create a new Notepad file, name it robots.txt, and upload it to the root directory of your domain using FTP.
But how can you actually make a robots.txt file?
Here’s the deal:
As I mentioned above, robots.txt is a normal text file.
But what if you don't have that file?
I will teach you step by step.
If you don't have a robots.txt file on your website, open any text editor (such as Notepad) and create a new file with one or more records.
Every record carries important instructions for search engines. Example:
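As a sketch, a minimal record of this kind might look like the following (the folder name here is only an illustration):

```
User-agent: *
Disallow: /folder/
```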
If these lines are written in the robots.txt file, they allow bots to index every page of your site.
But the disallowed folder in the root directory is excluded from indexing.
That means bots won't index that folder.
By using the Disallow directive, you can stop any bot from indexing a page or folder.
Many sites disallow their archive folder or pages to avoid creating duplicate content.
But where can you get the names of bots?
You can find the names of bots in your website's logs.
However, if you want a large number of visitors from search engines, you need to allow every bot,
which means every bot will be able to index your website.
Here is what you can write to allow every bot. For example:
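A sketch of an allow-all file: the wildcard matches every bot, and an empty Disallow value means nothing is blocked.

```
User-agent: *
Disallow:
```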
Like this, every bot will index your website.
But wait, there's a catch.
What not to do
1. Don't use comments in the robots.txt file.
2. Don't put a space at the beginning of any line, and don't insert stray spaces inside directives. Wrong example:
Dis allow: /support
3. Don't change the syntax of the commands.
4. If you want to block more than one directory or page, don't list them all on a single line like this:
Disallow: /support /cgi-bin /images/
Write a separate Disallow line for each path instead.
5. Use uppercase and lowercase letters properly; paths are case-sensitive. For example, if your directory is named "Download" but you write "download" in the robots.txt file, bots will treat it as a different path.
6. If you want all pages and directories of your site to be indexed, write:
User-agent: *
Disallow:
7. If you want no pages or directories of your site to be indexed, write:
User-agent: *
Disallow: /
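To expand on point 4 above: the safe way to block several directories is one Disallow line per path. A sketch using the paths mentioned earlier:

```
User-agent: *
Disallow: /support/
Disallow: /cgi-bin/
Disallow: /images/
```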
After editing the file, upload it with any FTP software to the root (home) directory of your site.
Robots.Txt for WordPress:
You can edit your robots.txt file either by logging into your server's FTP account or by using a plugin such as Robots Meta.
One thing you ought to add to your robots.txt file is your sitemap URL: adding it helps bots find your sitemap file, which results in faster indexing of your pages.
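As a sketch, a typical WordPress robots.txt file with a sitemap line might look like this; the domain is a placeholder, and the disallowed paths depend on your own setup:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```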
Confirm that no content is affected by the new robots.txt file
Now that you have made changes to your robots.txt file, it's time to check whether any of your content has been harmed by the update.
You can use Google Search Console's "Fetch as Google" tool to see whether your content can still be reached under the new robots.txt rules.
The steps are easy.
Visit Google Search Console > select your site > go to Diagnostics > Fetch as Google.
Add some post URLs from your site and check whether there is any issue fetching them.
You can also check for crawl errors caused by the robots.txt file under the Crawl Errors section of Search Console.
Click Crawl > Crawl Errors > select "Restricted by robots.txt", and you will see which links have been blocked by the robots.txt file.
Here you can see an example of a robots.txt crawl error:
In this picture, you can clearly see that the replytocom links have been blocked by robots.txt, along with other links that should not be part of Google's index.
For your information, the robots.txt file is a crucial part of SEO, and you can avoid many post-duplication issues by keeping your robots.txt file updated.
Now I want to hear from you:
Do you use Robots.txt to optimize your site?
Let me know by dropping a message in the comment section below.
And don’t forget to subscribe to our e-mail newsletter to keep receiving more SEO tips.