Robots.txt is a plain-text configuration file containing a small set of directives that control how your web content is shared with web crawlers. The robots exclusion protocol was created to tell robots (web crawlers) such as Googlebot and Bingbot how to access and index pages from a particular site. Generally, a personal website owner (e.g. a WordPress site) needs to place this file in the main or top-level directory.

Example: http://www.blogger.com/robots.txt

Read more about: Best Example ROBOTS.TXT Generated by google.com

For Blogger users, this file is placed in the root directory by default. To customize it, you need to enable it manually by changing a setting.

Steps to enable the robots.txt file in Blogger

Step 1. Log in to your account and go to Settings.
Step 2. Click on Edit and configure your directives.
Step 3. Click on Save changes.

Optimized ROBOTS.TXT file

A wrong setting may affect your whole website, or the file may even be ignored by search engines. So it is necessary ...
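For reference, here is a minimal sketch of the kind of robots.txt Blogger typically generates by default; the blog address yourblog.blogspot.com is a placeholder assumption, and your actual file may differ:

    # Ad crawler may visit all pages (typical Blogger default)
    User-agent: Mediapartners-Google
    Disallow:

    # All other crawlers: block auto-generated search/label pages, allow the rest
    User-agent: *
    Disallow: /search
    Allow: /

    # Placeholder sitemap URL -- replace with your own blog address
    Sitemap: https://yourblog.blogspot.com/sitemap.xml

The Disallow: /search line keeps crawlers out of Blogger's search and label result pages, which would otherwise be indexed as duplicate content.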