
How to Create and Enable Custom Robots.txt File in Blogger



Robots.txt is a file that tells search engines whether or not they are allowed to index a page in the search results. Search engine bots are automated, and before they access your site they check the robots.txt file to see whether they are allowed to crawl a given page.

Most people do not want to stop search engines from crawling their whole website, but some want to specify a few pages that should not be indexed in the search results. In this article, we will show you how to create and enable a custom robots.txt file in Blogger.

Step 1:
Go to Blogger.com >> your blog >> Dashboard >> Settings >> Search Preferences. Find the "Custom robots.txt" option and click its Edit link.





Step 2:
Select "Yes". Copy and paste the following code into the box provided, then click the "Save changes" button.
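A commonly used custom robots.txt for Blogger, built from the directives explained later in this article, looks like this. Treat it as a typical example rather than the only valid option; the sitemap URL shown is the standard Blogger location.

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblogname.com/sitemap.xml
```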


Step 3:

Now replace "yourblogname.com" with your own blog's homepage URL.


In Detail:

Custom robots.txt:

This option lets you edit your entire robots.txt file. You can type rules specifying which content should or should not be crawled by spiders. You can always undo your changes and revert to the default file.

Custom robots header tags:

This option is largely preset. It does not let you write your own code; instead, it offers a few options with checkboxes. If you have no idea about robots header tags, leave this option alone.

Now you have to enable the custom robots.txt file, so click the Edit link next to the "Custom robots.txt" option. Blogger will then ask whether you want to enable custom robots.txt content; click "Yes" and proceed to the next step.






In the text area, type the rules for the content you want to exclude from crawling. After editing the file to suit your needs, click the Save changes button. If you later want to revert to the default robots.txt file, click "No" instead of "Yes".


For Example :


Mediapartners-Google:

Mediapartners-Google is the user agent for Google AdSense, which crawls your pages so it can serve ads relevant to your content. If you disallow it, AdSense will not be able to show ads on the blocked pages.
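For example, to keep AdSense crawling every page even when other bots are restricted, give Mediapartners-Google its own rule with an empty Disallow (an empty Disallow blocks nothing):

```
User-agent: Mediapartners-Google
Disallow:
```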

User-agent:

The User-agent line names the robot that the rules below it apply to. "User-agent: *" applies the rules to all search engine robots, such as Google, Yahoo, and Bing.

Disallow:

This line tells search engine crawlers not to crawl the listed pages or directories.
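For example, the rules below tell every crawler to skip Blogger's /search pages and one specific static page (/p/contact.html is a hypothetical example page, not from the original article):

```
User-agent: *
Disallow: /search
Disallow: /p/contact.html
```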

Allow:

This line specifically tells search engines that they may always crawl the listed pages, even within an otherwise disallowed section.
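If you are unsure how a Disallow/Allow pair will behave, you can test it with Python's built-in robotparser module. This is a quick sketch; example.com and the URLs are placeholder values, not part of the original article.

```python
from urllib import robotparser

# The same rules used in a typical Blogger robots.txt
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /search pages are blocked; ordinary post URLs are allowed
print(rp.can_fetch("*", "https://example.com/search/label/news"))
print(rp.can_fetch("*", "https://example.com/2024/05/my-post.html"))
```

Running this prints False for the /search URL and True for the normal post URL, confirming that Disallow: /search only blocks the search pages while Allow: / keeps the rest of the blog crawlable.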


We hope this tip helps you in the future. If you have any issues regarding crawling, feel free to leave your questions in the comments and our experts will try to help you solve them.

