How to edit the robots.txt file in Blogger?

robots.txt is a plain text file that tells search engine crawlers which parts of a site they should not crawl. For example, if a page contains content you do not want showing up in search results, you can add a restriction for it in the robots.txt file and search engines will not crawl that page.
In Blogger, the robots.txt file can be viewed from Google Webmaster Tools. In Webmaster Tools, select your website and go to "Health". Under that, select "Blocked URLs".
There you will see the number of URLs that have been blocked by your robots.txt file.

Below that you will see the contents of the robots.txt file. The default robots.txt looks something like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.example.com/feeds/posts/default?orderby=updated
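You can check how crawlers will interpret this file with Python's standard `urllib.robotparser` module. This is just a sketch; the blog URL below is a placeholder, so substitute your own domain.

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt shown above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Label pages live under /search, so ordinary crawlers are blocked there
print(rp.can_fetch("*", "http://www.example.com/search/label/python"))   # False
# Regular post pages are allowed
print(rp.can_fetch("*", "http://www.example.com/2013/05/my-post.html"))  # True
```

`Disallow` matching is by path prefix, which is why a single `/search` rule blocks every label URL under it.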

Mediapartners-Google is the AdSense crawler, so do not change that section or your ads may no longer match your content.
The line Disallow: /search means that search engines will not crawl your label and archive pages. The blocked URLs you see are the label and archive URLs restricted by robots.txt.
To remove this restriction, go to your Blogger account. Blogger has recently added a feature that lets users add a custom robots.txt file.

Go to Settings --> Search Preferences --> Enable Custom Robots.txt

Now, if you want search engines to crawl everything in your blog, add the following text in the Custom robots.txt box and save it.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow:
Allow: /

Sitemap: http://www.example.com/feeds/posts/default?orderby=updated
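As a sanity check, the same `urllib.robotparser` test (again with a placeholder URL) shows that under this file nothing is disallowed for ordinary crawlers:

```python
from urllib.robotparser import RobotFileParser

# The permissive custom robots.txt shown above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow:
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# An empty Disallow line blocks nothing, so label pages under /search
# are now crawlable too
print(rp.can_fetch("*", "http://www.example.com/search/label/python"))  # True
```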
 
 
Note: After a few days, you will see that the robots.txt file has also changed in Google Webmaster Tools.
Once your label pages are crawled, their links will also appear in search results.
To check which of your pages have been indexed, simply go to google.com and type your site URL like this: site:www.example.com