When the Google crawler visits your site, the robots.txt file tells it which parts of the site it may or may not crawl. This file instructs search engine bots, such as Googlebot, which directories of your site should be crawled and shown on search result pages. Robots.txt is very helpful for controlling access to different pages, so review it before working on a website's URLs to understand how the site is crawled.
When a search engine arrives at a site, it first checks whether a robots.txt file exists. If it does, bots follow its Allow and Disallow rules for each web page. This file is very useful for controlling indexing, because we usually don't want search engines to index every page of a site.
User-agent: *
Disallow: /folder1/

User-agent: Googlebot
Disallow: /folder2/
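You can check how crawlers would interpret rules like the ones above with Python's standard urllib.robotparser module. The sketch below parses the same two rule groups; note that because Googlebot has its own group, it follows only that group and ignores the * rules (the paths and page names here are just examples).

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules directly as a list of lines
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /folder1/",
    "",
    "User-agent: Googlebot",
    "Disallow: /folder2/",
])

# Generic crawlers fall under the * group: blocked from /folder1/ only
print(rp.can_fetch("*", "/folder1/page.html"))          # False
print(rp.can_fetch("*", "/folder2/page.html"))          # True

# Googlebot matches its own group: blocked from /folder2/ only
print(rp.can_fetch("Googlebot", "/folder1/page.html"))  # True
print(rp.can_fetch("Googlebot", "/folder2/page.html"))  # False
```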
For more details, visit http://www.covetus.com/seoservices.html.