# Robots.txt for crawlers

User-agent: *

# Disallowed paths
Disallow: /User
Disallow: /Dateien

# Crawlers often generate invalid ScriptResource/WebResource requests
Disallow: /ScriptResource
Disallow: /WebResource

# Minimum delay between successive requests, in seconds
Crawl-Delay: 2

# Sitemap
# Sitemap: http://www.gn-online.de/Sitemap_Index.xml.gz
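# Note: Disallow rules match by URL-path prefix (RFC 9309), so
# "Disallow: /ScriptResource" also covers requests such as
# /ScriptResource.axd and anything else starting with that prefix.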