Blocking/restricting spider crawling with URL rewrite (pseudo-static) rules, for Apache and IIS


Recently I noticed that some junk spiders hitting my server are downright obnoxious: they waste a large amount of server resources, crawl as hard as they can with no regard for server performance, and ignore the robots.txt protocol entirely. Fortunately they can be blocked with URL rewrite (pseudo-static) rules. The methods are described below.

On Windows Server 2008 with IIS 7 / IIS 7.5, add the following rule to web.config. If the URL rewrite (pseudo-static) module is not installed, install it first.

<!-- Goes inside <system.webServer>/<rewrite>/<rules> in web.config -->
<rule name="Block spider">
  <!-- Apply to every URL except robots.txt (negate="true") -->
  <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
  <conditions>
    <!-- Match the unwanted crawlers by User-Agent -->
    <add input="{HTTP_USER_AGENT}" pattern="Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot" ignoreCase="true" />
  </conditions>
  <!-- Answer matched requests with 403 Forbidden -->
  <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Forbidden" />
</rule>
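
A quick way to confirm the rule is working is to request the site with a spoofed spider User-Agent and check that it gets a 403, while a normal browser User-Agent and robots.txt still go through. The sketch below uses only the Python standard library; the host example.com is a placeholder for your own site.

# Minimal check of the spider-blocking rule.
# "http://example.com" is a placeholder -- replace it with your own host.
import urllib.error
import urllib.request

def status_for(url, user_agent):
    """Return the HTTP status code the server sends for this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

base = "http://example.com"
print(status_for(base + "/", "AhrefsBot"))            # expected: 403 (blocked)
print(status_for(base + "/", "Mozilla/5.0"))          # expected: 200 (allowed)
print(status_for(base + "/robots.txt", "AhrefsBot"))  # expected: 200 (robots.txt is exempt)

The same check applies unchanged to the Apache and IIS 6 configurations further down.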

On Linux with Apache, add the following to the .htaccess rule file (if the site has no .htaccess, create one by hand in the site root):

<IfModule mod_rewrite.c>
RewriteEngine On
# Block spiders: return 403 to the user agents below for every URL except robots.txt
RewriteCond %{HTTP_USER_AGENT} "Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
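
Before adding more names to the list, it is worth checking which user agents are actually hitting the server hardest. A rough sketch, assuming Apache's combined log format and a log at /var/log/apache2/access.log (adjust the path for your environment):

# Count requests per User-Agent in an Apache combined-format access log.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # assumed path -- point at your own log
# In the combined format the User-Agent is the last quoted field on each line.
ua_at_end = re.compile(r'"([^"]*)"\s*$')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = ua_at_end.search(line)
        if match:
            counts[match.group(1)] += 1

# The busiest user agents you do not recognise are candidates for the block list.
for ua, n in counts.most_common(20):
    print(f"{n:8d}  {ua}")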

On Windows Server 2003 with IIS 6.0, add the following to the rule file httpd.conf or httpd.ini (on the server, or via the "ISAPI filter custom settings" option in the virtual host control panel, enable custom rewriting with ISAPI_Rewrite 3.1 or its free edition):

# Block spiders (same rule in ISAPI_Rewrite syntax)
RewriteCond %{HTTP_USER_AGENT} (Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu) [NC]
RewriteRule !(^/robots\.txt$) - [F]
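
Because all three configurations share the same User-Agent pattern, it can help to preview locally which agents the pattern will catch before deploying it. A small sketch (the sample User-Agent strings are illustrative only); re.IGNORECASE plays the role of the [NC] flag:

# Preview which User-Agent strings the blocking pattern would match.
import re

block_pattern = re.compile(
    r"Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|"
    r"Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|"
    r"YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|"
    r"Python|Wget|Xenu|ZmEu",
    re.IGNORECASE,  # same effect as the [NC] flag
)

samples = [
    "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)",
    "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)",
    "curl/7.68.0",
]
for ua in samples:
    verdict = "BLOCKED" if block_pattern.search(ua) else "allowed"
    print(f"{verdict:8s} {ua}")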

Note: by default the rules above only target obscure, unwanted spiders; to block other spiders, add or swap names in the pattern in the same way (a sketch for generating the pattern follows the list below).

User-agent names of the major search spiders:

Google spider: googlebot

Baidu spider: baiduspider

Yahoo spider: slurp

Alexa spider: ia_archiver

MSN spider: msnbot

Bing spider: bingbot

AltaVista spider: scooter

Lycos spider: lycos_spider_(t-rex)

AllTheWeb spider: fast-webcrawler

Inktomi spider: slurp

Youdao spider: YodaoBot and OutfoxBot

Retu (热土) spider: Adminrtspider

Sogou spider: sogou spider

SOSO spider: sosospider

360 Search spider: 360spider
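
Since the blocking pattern is just a |-separated list of names, it can also be generated rather than edited by hand when you want to add spiders per the note above. A minimal sketch; the two extra names are hypothetical examples, not part of the original rule:

# Build the User-Agent blocking pattern from a list of crawler names.
import re

base_blocklist = [
    "Webdup", "AcoonBot", "AhrefsBot", "Ezooms", "EdisterBot", "EC2LinkFinder",
    "jikespider", "Purebot", "MJ12bot", "WangIDSpider", "WBSearchBot", "Wotbox",
    "xbfMozilla", "Yottaa", "YandexBot", "Jorgee", "SWEBot", "spbot",
    "TurnitinBot-Agent", "mail.RU", "curl", "perl", "Python", "Wget", "Xenu", "ZmEu",
]
extra = ["SemrushBot", "DotBot"]  # hypothetical additions -- use what your logs show

# Escape each name so characters such as "." match literally, then join with "|".
pattern = "|".join(re.escape(name) for name in base_blocklist + extra)
print(pattern)  # paste into the RewriteCond line or the web.config pattern attribute

Take care not to paste any of the major spiders listed above into the block list, or legitimate search engines will be shut out as well.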

Last updated: 2020-12-31