In the past, we have shown you how to create a custom robots.txt file in Blogger. However, creating and maintaining a correct robots.txt file is not only crucial but can also be difficult to master. If your site has a large number of pages and directories, then blocking individual URLs can be quite difficult for a non-technical publisher. To make this easier, Google provides a robots.txt testing tool that does the job for you. In this article, we will show you how to test your robots.txt file with the robots.txt tester in Blogger.
The robots.txt tester tool tells you in detail whether your current robots.txt file is blocking Google's search crawlers from accessing any specific URL on your site. In other words, you can use this tool to verify that Googlebot cannot crawl a page you wish to keep out of Google search.
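For reference, a Blogger robots.txt file typically looks something like the following. The sitemap URL here is a placeholder for your own blog address, and the `Disallow: /search` rule keeps label and search-result pages out of the index while `Allow: /` permits everything else:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```

When you paste a URL into the tester, Google checks it against rules like these and reports which line, if any, blocks the crawler.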
First, log in to Google Webmaster Tools, open the robots.txt tester, and from the list of your verified properties select the one you would like to test.
You will now see your current robots.txt file, and you can test different URLs to check whether Google's crawlers are disallowed from crawling them. Type a URL into the text box at the bottom of the page and press the Test button.
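If you would rather check rules offline, Python's standard `urllib.robotparser` module applies the same Allow/Disallow matching to a robots.txt file. This is only an illustrative sketch: the domain and paths below are placeholders, not your actual blog, and the tester in Webmaster Tools remains the authoritative check for how Googlebot itself reads your file.

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt like the one Blogger typically serves:
# label/search pages are disallowed, everything else is allowed.
robots_txt = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A label page falls under /search, so it is blocked.
print(rp.can_fetch("Googlebot", "https://example.blogspot.com/search/label/news"))

# An ordinary post URL is allowed.
print(rp.can_fetch("Googlebot", "https://example.blogspot.com/2024/01/post.html"))
```

Running this prints `False` for the label URL and `True` for the post URL, mirroring the pass/fail result the online tester would show for the same rules.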