
One of the most boring topics in technical SEO is robots.txt. Rarely is there an interesting problem to solve in the file, and most errors come from not understanding the directives or from typos. The general purpose of a robots.txt file is simply to suggest to crawlers where they can and cannot go.

Basic parts of the robots.txt file

- User-agent — specifies which robot the rules apply to.
- Disallow — suggests the robots not crawl this area.
- Allow — allows robots to crawl this area.
- Crawl-delay — tells crawlers how many seconds to wait before fetching another page.
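To make the directives concrete, here is a minimal, hypothetical robots.txt that uses all four parts together. The paths and delay value are illustrative, not a recommendation:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/        # suggest crawlers skip this area
Allow: /admin/public/    # carve out an exception within the disallowed area
Crawl-delay: 10          # ask crawlers to wait 10 seconds between requests
```

Keep in mind that Crawl-delay is not part of the original standard, and some major crawlers, including Googlebot, ignore it.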