# created by Mark Chan
# <URL:http://www.robotstxt.org/wc/exclusion.html#robotstxt>
# Format is:
#   User-agent: <name of spider>
#   Disallow: <nothing> | <path>

User-agent: facebook
Disallow: /

User-agent: FacebookBot
Disallow: /

User-agent: bytedance
Disallow: /

User-agent: bytedanceBot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: aboveBot
Disallow: /

User-agent: above
Disallow: /

User-agent: dataforseo
Disallow: /

User-agent: above-bot
Disallow: /

User-agent: dataforseo-bot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: SemrushBot-SA
Disallow: /

User-agent: SemrushBot-BA
Disallow: /

User-agent: SemrushBot-SI
Disallow: /

User-agent: SemrushBot-SWA
Disallow: /

User-agent: SemrushBot-CT
Disallow: /

User-agent: SemrushBot-BM
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: ZoominfoBot
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: ImagesiftBot
Disallow: /

User-agent: Imagesift Bot
Disallow: /