Block automated spam bots via .htaccess or PHP?
Will there be a performance hit when I add this to my .htaccess file:
HOWTO stop automated spam-bots using .htaccess
or should I add it to my PHP file instead?
Or leave it out completely, since spammers might fake their user agent anyway?
Would it also make sense to prevent users from accessing your website via a proxy server? I know that this might also block people from accessing your website who didn't come here with bad intentions. But, what are some of the reasons why people would visit a website via a proxy server, other than spam, or when a website is blocked in their country?
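The HOWTO linked above builds a long chain of rewrite rules keyed on the User-Agent header. A minimal sketch of that style of rule (the agent names here are illustrative placeholders, not the article's actual list):

```apache
# Reject requests whose User-Agent matches known bad bots.
# "BadBot" and "EvilScraper" are illustrative names only.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [NC]
RewriteRule ^ - [F,L]
```

The `[F]` flag returns 403 Forbidden; `[NC]` makes the match case-insensitive. The performance question above is about how long such a chain of `RewriteCond` lines can get before it matters.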
Possibly, if you have thousands or tens of thousands of user agent strings to match against. Apache has to check this rule on every request.
No; Apache's parsing of .htaccess will still be quicker than a PHP process, because for PHP, Apache has to start a PHP interpreter process for every request.
Probably yes. It is very likely that most malicious spam bots will be faking a standard user agent.
There are a lot of legitimate uses for a proxy server. One is mobile clients that use some sort of prefetching to save mobile traffic. There are also some ISPs who force their clients to use their proxy servers. In my opinion, locking out users who use a proxy server is not a wise move.
The bottom line is probably that these things are not worth worrying about unless you have a lot of traffic going to waste because of malicious activities.
I personally would focus more on securing the basics of the website, like forms, code, and open ports, rather than on blocking. A visit counts anyway! ;)
...what's wrong with setting up domain dot com/bottrap, disallowing access to it through robots.txt, capturing the naughty bot, putting its IP in a .txt array, and denying it access with a 403 header forever?
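The trap described above can be sketched in a few lines of PHP. The file name, ban-list path, and function names below are my assumptions for illustration, not code from the comment:

```php
<?php
// Hypothetical trap page (e.g. /bottrap/index.php): robots.txt disallows
// this path, so any client that fetches it anyway gets its IP recorded.
// BANLIST path is an assumption; adjust to taste.
const BANLIST = __DIR__ . '/banned_ips.txt';

// Return the list of banned IPs (one per line in the ban file).
function banned_ips(string $file = BANLIST): array {
    if (!is_file($file)) {
        return [];
    }
    return array_values(array_filter(array_map('trim', file($file))));
}

// Record a newly trapped IP, avoiding duplicate entries.
function ban_ip(string $ip, string $file = BANLIST): void {
    if ($ip !== '' && !in_array($ip, banned_ips($file), true)) {
        file_put_contents($file, $ip . PHP_EOL, FILE_APPEND | LOCK_EX);
    }
}

// Called at the top of every regular page: send 403 to banned clients.
// Returns true when the caller should stop rendering and exit.
function reject_if_banned(string $ip, string $file = BANLIST): bool {
    if (in_array($ip, banned_ips($file), true)) {
        http_response_code(403);
        return true;
    }
    return false;
}

// On the trap page itself you would call:
//   ban_ip($_SERVER['REMOTE_ADDR'] ?? '');
```

The main caveat, as noted in the accepted answer, is that reading and scanning a ban file on every request is exactly the kind of per-request cost the original question worries about.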
PHP Limit/Block Website requests for Spiders/Bots/Clients etc.
Here I have written a PHP function which can block unwanted requests to reduce your website traffic. Good for spiders, bots and annoying clients.
CLIENT/Bots Blocker
DEMO: http://szczepan.info/9-webdesign/php/1-php-limit-block-website-requests-for-spiders-bots-clients-etc.html
CODE:
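The code block itself did not survive here. Along the lines of what the answer describes, a per-client request limiter might look like the following sketch; the function name, parameters, and the in-memory store are assumptions, not the author's original code:

```php
<?php
// Minimal sketch of a per-client rate limiter (not the original code).
// $store maps a client key (e.g. IP address) to recent request timestamps.
// Allows at most $max requests per $window seconds; returns false when the
// client is over the limit, so the caller can send a 403/429 and exit.
function allow_request(array &$store, string $key, int $max, int $window, ?int $now = null): bool {
    $now = $now ?? time();
    // Keep only timestamps still inside the sliding window.
    $recent = array_values(array_filter(
        $store[$key] ?? [],
        fn($t) => $t > $now - $window
    ));
    if (count($recent) >= $max) {
        $store[$key] = $recent;
        return false;
    }
    $recent[] = $now;
    $store[$key] = $recent;
    return true;
}

// Typical use at the top of a page (session or file storage in practice):
//   if (!allow_request($store, $_SERVER['REMOTE_ADDR'] ?? 'unknown', 30, 60)) {
//       http_response_code(429);
//       exit;
//   }
```

A real deployment would persist `$store` in the session, a file, or a cache such as APCu, since a plain array resets on every request.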