Tips to Use Blogger's New Crawlers and Indexing Feature Effectively for Improved SEO
In this post I'm going to show you how to use Blogger's new crawlers and indexing feature effectively to improve your blog's SEO.
Before starting this guide, I'd like to briefly explain what robots.txt is. A robots.txt file gives instructions to search engine bots on how to crawl your site's content, and it can also point them to a sitemap. For instance, you can keep certain items out of search results with it and improve your site's SEO.
Important: If you are unsure about any of the steps, do not make any edits.
How to create a custom robots.txt on Blogger
1) Log in to your Blogger account.
2) Go to your blog.
3) Now go to "Settings > Search Preferences".
4) Under "Crawlers and indexing" you will find "Custom robots.txt".
5) Click the "Edit" link to open the custom robots.txt box.
Now you can add instructions for different crawlers. In this post I have added several useful robots.txt snippets. First, let's see how this code works.
User-agent: *
Disallow: /search
Allow: /
In the above code,
User-agent: - names the crawler the rules apply to, for instance Googlebot. An asterisk (*) means all crawlers.
Disallow: - specifies which pages should not be crawled.
Allow: - specifies which pages may be crawled.
/ (slash): - indicates the root of your blog, i.e. your home page and everything under it.
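As an illustration, the same rules written so they apply only to Google's main web crawler (its user-agent token is Googlebot) rather than to every bot would look like this:
User-agent: Googlebot
Disallow: /search
Allow: /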
Setup instructions for all robots
The simplest setup lets every robot crawl and index the entire blog, including label and search pages. To do that, use the following code.
User-agent: *
Allow: /
Block label and search page crawling.
If you use the above code, Blogger's label and search pages (which repeat your post content) will also be indexed, which can cause duplicate content issues and reduce your site's ranking. Instead, you can use the following code. It allows the entire blog to be crawled but keeps crawlers out of label and search pages.
User-agent: *
Disallow: /search
Allow: /
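Blogger serves label pages under /search/label/, so if you only wanted to keep crawlers out of label pages while leaving other search URLs alone, a narrower variation could look like this:
User-agent: *
Disallow: /search/label/
Allow: /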
Block certain page(s).
For some reason, you may need to hide a selected page or pages from search engines. In that case you can use the following code.
User-agent: *
Disallow: /p/page-one.html
Allow: /
If you need to block more than one page, add their URLs one by one, each on its own Disallow line, like below.
User-agent: *
Disallow: /p/page-one.html
Disallow: /p/page-two.html
Allow: /
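Individual posts can be excluded the same way. The path below is only a placeholder; substitute the path of one of your own posts (Blogger post URLs follow the /year/month/title.html pattern):
User-agent: *
Disallow: /2014/01/example-post.html
Allow: /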
Allow all but block a specific crawler.
If you want to block a single crawler while everything else stays open, you can add the following code, replacing <bot name> with the crawler you want to keep out.
User-agent: <bot name>
Disallow: /
For example, to keep the Google News crawler (Googlebot-News) out of your blog:
User-agent: Googlebot-News
Disallow: /
Set up the AdSense crawler instruction.
To improve your Google AdSense performance, you can specify how the AdSense bot (Mediapartners-Google) crawls your site. There is no need to block anything, so leave the Disallow line empty; that lets the ad crawler reach every page.
User-agent: Mediapartners-Google
Disallow:
Block image indexing.
If you don't want your blog posts' images to appear in Google's image search results, you can block Google's image crawler with the following code.
User-agent: Googlebot-Image
Disallow: /
If you want only one selected bot to crawl your site and every other bot to be blocked, use the following code. Replace <required crawler name> in the "User-agent:" line with the bot you want to allow.
User-agent: <required crawler name>
Disallow:
User-agent: *
Disallow: /
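As a concrete instance (using Googlebot as the permitted crawler), the following lets Google's crawler fetch everything while every other bot is kept out:
User-agent: Googlebot
Disallow:
User-agent: *
Disallow: /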
Adding a sitemap.
Apart from the above crawl instructions, you can also add a sitemap. Normally Blogger's default sitemap only covers the 26 most recent posts, so you can point crawlers to a fuller sitemap using "Custom robots.txt". This is an example of how to add one.
Sitemap: http://allabout4fun.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
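If your blog has more than 500 posts, you can list additional ranges the same way by adjusting start-index (this sketch reuses the example feed URL above):
Sitemap: http://allabout4fun.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://allabout4fun.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500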
Final result demo.
To improve your blog's search engine visibility and have the entire blog crawled except for label and search pages, use the following code as your robots.txt.
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
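If you also want the sitemap from the previous section included, the complete file could look like this (again using the example feed URL above):
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://allabout4fun.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
After saving, you can confirm the live file by visiting your blog's address followed by /robots.txt.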
If you find this article useful, please feel free to link to this page from your website or blog.