What is a Robots.txt File

One of my customers was looking through her website statistics when she saw something called a robots.txt file.  “What the heck is it?” she asked.

Robots.txt is a plain-text file that website owners place at the root of their site to give instructions to search engine crawlers. It tells crawlers to skip the contents of certain folders. For example, you may not want your images listed in a search engine.  Robots.txt is also used to keep behind-the-scenes code, such as CGI scripts, out of search engine listings.
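As an illustration, here is a minimal robots.txt along those lines. The folder names are just examples; yours would match your own site's structure:

```
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/
```

The `User-agent: *` line says the rules apply to every crawler, and each `Disallow` line names a path crawlers are asked to skip. The file sits at the top level of the site, e.g. www.example.com/robots.txt.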

While it’s common to use a robots.txt file to try to keep certain files out of search engine listings, it’s not foolproof.  The file is only a request, and some crawlers simply ignore it.
