What is the robots.txt file?

The robots.txt file is a plain-text file placed in the main (root) directory of your web site. Web robots, such as search engine spiders, request it before crawling to check whether you want to exclude them from any part of your site. Even if you have no need to restrict these robots, it is best to keep an empty robots.txt file in your root directory; otherwise, every robot's failed request for the file is logged as an error in your web statistics, obscuring genuine failures such as requests for accidentally deleted pages.
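As a sketch of the standard robots.txt syntax, the following example (with `/private/` as a hypothetical directory name) tells all robots to stay out of one directory while leaving the rest of the site open to crawling:

```
# Applies to all robots (the * wildcard matches any user agent)
User-agent: *
# Exclude this one directory from crawling
Disallow: /private/
```

An empty `Disallow:` line (or an empty file) permits crawling of the entire site, which has the same effect as the empty robots.txt recommended above while still avoiding failed-request errors in your logs.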

