IP banning - the most efficient method?

Posted 2024-10-15 13:54:01


I run a large forum and, like everyone else, have issues with spammers/bots. There are huge lists of known spam IPs that you can download and use in .htaccess form, but my only concern is the file size. So I suppose the question is how big is too big, given that it's going to be loaded for every user. Adding all the IPs gets it to about 100 KB.

Is there an alternative with less overhead? Possibly doing it in PHP, or will that also cause heavy load due to the file size, checking IPs, and so on?

Any advice would be greatly appreciated.

Thanks,

Steve


Comments (8)

赏烟花じ飞满天 2024-10-22 13:54:01


There are often more efficient approaches than IP bans. For example, a hidden form field that only bots will fill out, or requiring JavaScript or cookies to submit forms.

For IP banning, I wouldn't use .htaccess files. Depending on your web server, it may re-read the .htaccess file on every request. I'd definitely add the IP bans to your web server's vhost configuration instead. That way I can be sure the web server keeps it in RAM and doesn't read it again and again.

Doing it via PHP would also be an option. That way, you could also easily limit the bans to specific forms, such as registration in your forum.
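A minimal sketch of the hidden-field (honeypot) idea in PHP. The decoy field name `website` and the form markup are assumptions for illustration, not from the original post:

```php
<?php
// Honeypot check: the form contains an extra field hidden from humans
// via CSS; real users leave it empty, naive bots fill it in.
function is_probably_bot(array $post): bool
{
    // 'website' is an arbitrary decoy field name (an assumption here).
    return isset($post['website']) && trim($post['website']) !== '';
}

// Example form markup (the wrapper div is hidden with CSS):
//   <div style="display:none">
//     <input type="text" name="website" value="">
//   </div>

var_dump(is_probably_bot(['website' => 'http://spam.example'])); // likely a bot
var_dump(is_probably_bot(['website' => '']));                    // likely human
```

Run on form submission, this rejects the dumbest bots without inconveniencing users the way a captcha does.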

长安忆 2024-10-22 13:54:01


There are a few options:

  • You can store the block list in a database. Querying there is more efficient than looping in PHP.
  • You could pre-process the list with array_map("ip2long", ...) to save memory and possibly lookup time.
  • You could pack the IP list into a regular expression, and maybe run it through an optimizer (Perl's Regexp::Optimizer). A PCRE test would again be faster than foreach and strpos tests.
    $regex = implode("|", array_map("preg_quote", array_map("trim", file("ip.txt"))));

But then, IP block lists are often not very reliable. Maybe you should implement the other two workarounds instead: hidden form fields to detect dumb bots, or captchas to block non-humans (not very user-friendly, but it solves the problem).
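The ip2long() pre-processing above can be sketched as follows. The file name `ip.txt` (one IPv4 address per line) is an assumption, and this handles IPv4 only; IPv6 would need inet_pton():

```php
<?php
// Convert the IP list to sorted integers once, then binary-search per request.
function load_blocklist(string $file): array
{
    $lines = array_map('trim', file($file, FILE_IGNORE_NEW_LINES));
    $ips = array_map('ip2long', $lines);  // IPv4 string -> integer
    sort($ips);                           // sorted list enables O(log n) lookup
    return $ips;
}

function is_blocked(array $sorted, string $ip): bool
{
    $needle = ip2long($ip);
    $lo = 0;
    $hi = count($sorted) - 1;
    while ($lo <= $hi) {                  // classic binary search
        $mid = intdiv($lo + $hi, 2);
        if ($sorted[$mid] === $needle) {
            return true;
        }
        if ($sorted[$mid] < $needle) {
            $lo = $mid + 1;
        } else {
            $hi = $mid - 1;
        }
    }
    return false;
}
```

Compared with a foreach over string IPs, this cuts both memory (4-byte integers instead of strings) and lookup time (logarithmic instead of linear).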

ら栖息 2024-10-22 13:54:01


Well, you are building a database of addresses, right? Wouldn't it be useful to use a database product for it? If you don't have one yet, SQLite could be up to the task.
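A rough sketch of the SQLite approach via PDO. The table and column names are assumptions; the PRIMARY KEY on the ip column gives an indexed, cheap per-request lookup:

```php
<?php
// SQLite block list via PDO (requires the pdo_sqlite extension).
$db = new PDO('sqlite::memory:');       // use a file path in production
$db->exec('CREATE TABLE IF NOT EXISTS banned_ips (ip TEXT PRIMARY KEY)');

// Seed one example entry (INSERT OR IGNORE makes re-inserts harmless).
$ins = $db->prepare('INSERT OR IGNORE INTO banned_ips (ip) VALUES (?)');
$ins->execute(['203.0.113.7']);

function is_banned(PDO $db, string $ip): bool
{
    $q = $db->prepare('SELECT 1 FROM banned_ips WHERE ip = ?');
    $q->execute([$ip]);
    return (bool) $q->fetchColumn();
}

// In a front controller you might then do something like:
//   if (is_banned($db, $_SERVER['REMOTE_ADDR'])) { http_response_code(403); exit; }
```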

莫言歌 2024-10-22 13:54:01


Maybe you want to stop spam the good ol' fashioned way - with a captcha?

I believe a Mr. Albert Einstein once said: problems cannot be solved at the same level of awareness that created them :)

淡莣 2024-10-22 13:54:01


Unless you already have problems with the load on your server, you will probably not notice the difference a 100 KB .htaccess file makes.
There may be faster alternatives, perhaps using iptables, sorted IP lists that can be searched for matches more quickly, or even a database (though the overhead of a single database query might outweigh the benefit of indexed tables), but it is probably not worth the effort unless you run a forum with a high load.

You can alternatively try captchas or something similar. Everything in this direction comes at a cost, and nothing is 100% reliable.

寄与心 2024-10-22 13:54:01


Don't use such IP lists. They're likely to become outdated, and you might block the wrong requests. Just invest in good captchas, and only block IPs occasionally, when they're really doing some kind of denial-of-service attack.

谎言 2024-10-22 13:54:01


Why force the web server to handle blocking users? I'd suggest using null routes (since iptables will slow your server down as the number of blocked IP entries grows).

Read up on http://www.cyberciti.biz/tips/how-do-i-drop-or-block-attackers-ip-with-null-routes.html

and http://php.net/manual/en/function.shell-exec.php
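Combining the two links, a hedged sketch of adding a null route from PHP. The `ip route add blackhole` syntax follows the linked article; actually executing it requires root, so this sketch only builds and validates the command:

```php
<?php
// Build a null-route command for a blocked IPv4 address.
// Running it (e.g. via shell_exec()) requires root privileges.
function null_route_cmd(string $ip): ?string
{
    // Refuse anything that is not a plain IPv4 address before shelling out.
    if (filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_IPV4) === false) {
        return null;
    }
    return 'ip route add blackhole ' . escapeshellarg($ip);
}

// Usage (as root, and only after validation):
//   shell_exec(null_route_cmd('203.0.113.7'));
```

The validation step matters: passing unchecked input to shell_exec() is a command-injection risk, so escapeshellarg() plus an IP-format check is the minimum precaution.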

旧人 2024-10-22 13:54:01


In the .htaccess in your DocumentRoot, after:

    Order Deny,Allow

append a line:

    Deny from <blocked IP>