public/robots.txt
# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
# See also http://en.wikipedia.org/wiki/Robots_exclusion_standard for extensions

User-agent: *
# Block search and tag queries
Disallow: /search
Disallow: /tag
Disallow: /search*query=*
Disallow: /cat/*

# The policy is to block any SEO/Marketing bot that is too aggressive.
# If you annoy our sysadmins you will be blocked.
# Reach out to porcellis@eletrotupi.com for any questions.

# Too aggressive, marketing/SEO
User-agent: SemrushBot
Disallow: /

# Too aggressive, marketing/SEO
User-agent: SemrushBot-SA
Disallow: /

# Marketing/SEO
User-agent: AhrefsBot
Disallow: /

# Marketing/SEO
User-agent: dotbot
Disallow: /

# Marketing/SEO
User-agent: rogerbot
Disallow: /

# Too aggressive
User-agent: bingbot
Disallow: /

User-agent: Bingbot
Disallow: /

# Too aggressive
User-agent: Yandex
Disallow: /

# Too aggressive
User-agent: The Knowledge AI
Disallow: /

# Too aggressive
User-agent: YandexBot
Disallow: /

# Too aggressive (Knowledge AI)
User-agent: Knowledge
Disallow: /

User-agent: msnbot
Disallow: /

User-agent: Purebot
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: Lipperhey
Disallow: /

User-agent: Mail.Ru
Disallow: /

User-agent: scrapbot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: BDCbot
Disallow: /

User-agent: MegaIndex
Disallow: /

User-agent: UniLeipzigASV
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: Typhoeus
Disallow: /

User-agent: PetalBot
Disallow: /
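
As a quick sanity check of how these rules are evaluated, the sketch below uses Python's standard urllib.robotparser against a subset of the groups above. "ExampleBot" is a hypothetical crawler name used only for illustration, and note that the standard-library parser implements the basic exclusion standard only: it does not interpret the * wildcards in paths such as /search*query=*, so the sketch exercises only the plain prefix rules.

    from urllib import robotparser

    # A subset of the rules above, enough to exercise the checks below;
    # the full public/robots.txt blocks a longer list of crawlers.
    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /search
    Disallow: /tag

    User-agent: SemrushBot
    Disallow: /
    """

    parser = robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # "ExampleBot" (hypothetical) falls under the "User-agent: *" group:
    # it may fetch the front page but not the search endpoint.
    print(parser.can_fetch("ExampleBot", "/"))        # True
    print(parser.can_fetch("ExampleBot", "/search"))  # False

    # SemrushBot matches its dedicated group and is disallowed everywhere.
    print(parser.can_fetch("SemrushBot", "/"))        # False

The triple-quoted string is indented only for display here; if you run it, dedent the rule lines so each directive starts at column one, since robots.txt directives are line-oriented.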