#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html

User-agent: MJ12bot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: SemanticScholarBot
Disallow: /

User-agent: PetalBot
Disallow: /

# Baiduspider
User-agent: Baiduspider
Disallow: /

# Yandex
User-agent: Yandex
Disallow: /

User-agent: trendictionbot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: GoogleBot
Crawl-delay: 10
# CSS, JS, Images
Allow: /$
Allow: */Category/Index
Allow: */Category/List
Allow: */Category/ListSubcategories
Allow: */Product/Static
Allow: */Product/Template
Allow: */Intranet/
Allow: */c/
Allow: */p/
Disallow: /

User-agent: facebookexternalhit
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: *
Crawl-delay: 10
Disallow: /
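
# ---------------------------------------------------------------------------
# Illustration only, not part of the robots.txt rules above: a minimal Python
# sketch (assuming Python 3.6+) of how a standards-compliant client could
# check a few of these groups with urllib.robotparser before crawling. The
# https://example.com URLs and the bot name "SomeOtherBot" are placeholders.
# Note that the standard-library parser does plain prefix matching, so the
# "*" and "$" patterns used in the GoogleBot group are not evaluated here.
# ---------------------------------------------------------------------------
import urllib.robotparser

# A few representative groups copied from the file above.
ROBOTS_TXT = """\
User-agent: SemrushBot
Disallow: /

User-agent: facebookexternalhit
Allow: /

User-agent: *
Crawl-delay: 10
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())  # feed the rules directly, no HTTP fetch

# Blocked bot: the SemrushBot group disallows everything.
print(parser.can_fetch("SemrushBot", "https://example.com/any/page"))      # False

# Explicitly allowed bot: facebookexternalhit may fetch anything.
print(parser.can_fetch("facebookexternalhit", "https://example.com/p/1"))  # True

# Any other bot falls through to the catch-all group: disallowed, 10 s delay.
print(parser.can_fetch("SomeOtherBot", "https://example.com/"))            # False
print(parser.crawl_delay("SomeOtherBot"))                                  # 10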