# Robots.txt: Configure which spiders can crawl this site
# Why is this server crawling my site?
# The user agents below are blocked from the entire site.
User-agent: panscient_data_services.demarc.cogentco.com
User-agent: BaiDuSpider
User-agent: WISENutbot
Disallow: /
# Allow all others not listed above, except for the paths below
User-agent: *
Disallow: /Backgrounds
Disallow: /blogs/Status