# $Id: robots.txt,v 1.7.2.3 2008/12/10 20:24:38 drumm Exp $
#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

#Sitemap: http://ready2beat.com/sitemap.xml

User-agent: *
# Directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /profiles/
Disallow: /scripts/
Disallow: /sites/
Disallow: /themes/
# Files
Disallow: /cron.php
Disallow: /install.php
Disallow: /update.php
Disallow: /xmlrpc.php
# Paths (clean URLs)
Disallow: /admin/
Disallow: /aggregator
Disallow: /comment/reply/
Disallow: /comment/delete/
Disallow: /comment/edit/
Disallow: /contact/
Disallow: /logout/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/
Disallow: /myajax
# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=aggregator
Disallow: /?q=comment/reply/
Disallow: /?q=contact/
Disallow: /?q=logout/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
# Additional rules. Note: Disallow values must be root-relative paths
# starting with "/"; wildcard "*" and end-anchor "$" are Google/Bing
# extensions, not part of the original robots.txt standard.
Disallow: /node/
Disallow: /node$
Disallow: /user$
Disallow: /*sort=
Disallow: /linkout
Disallow: /search$
Disallow: /*/feed$
Disallow: /*/track$

# Allow Google's AdSense crawler everywhere (an empty Disallow
# permits all paths for this agent).
User-agent: Mediapartners-Google
Disallow: