Exact limit on requests from a single IP
I'm developing an application which fetches the top 20 pages for all letters. At the moment the limit isn't a problem for me, but I need to know: what is the exact number of requests allowed from one IP address per second?
Best regards,
Comments (1)
There is no exact number per second. Like any other site, if you make too many requests you will likely get blocked as a denial-of-service attack. If you keep making too many over an extended period of time, Facebook will likely block you, at least temporarily.
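Since no hard per-second figure is published, a practical approach is to pace your own requests and back off when the server signals throttling. Below is a minimal Python sketch using the third-party `requests` package; the one-request-per-second pace, the retry count, and treating HTTP 429/5xx as the throttling signal are my own conservative assumptions, not documented Facebook limits.

```python
import time
import requests  # third-party: pip install requests

# Assumed, not official: roughly one request per second with exponential backoff.
MIN_INTERVAL = 1.0   # seconds between requests
MAX_RETRIES = 5

def fetch_with_backoff(url, params=None):
    """Fetch a URL politely: fixed pacing plus exponential backoff on throttling."""
    delay = MIN_INTERVAL
    for attempt in range(MAX_RETRIES):
        resp = requests.get(url, params=params, timeout=30)
        # 429 (Too Many Requests) or a 5xx response suggests we are being throttled.
        if resp.status_code == 429 or resp.status_code >= 500:
            time.sleep(delay)
            delay *= 2  # back off exponentially before retrying
            continue
        time.sleep(MIN_INTERVAL)  # pace successful requests as well
        return resp
    raise RuntimeError(f"Gave up on {url} after {MAX_RETRIES} attempts")
```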
If you are trying to crawl Facebook, then you should obey the rules defined in their robots.txt file like any other crawler/spider should.
https://www.facebook.com/robots.txt
http://www.facebook.com/apps/site_scraping_tos_terms.php
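If you do crawl, you can check each URL against that robots.txt programmatically before fetching it. This is a small sketch using Python's standard-library `urllib.robotparser`; the user-agent string and the example URL are placeholders I chose for illustration, not values from the original post.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical user-agent name for your crawler; Facebook's robots.txt only
# permits specific agents, so always check before fetching anything.
USER_AGENT = "mycrawler"

rp = RobotFileParser()
rp.set_url("https://www.facebook.com/robots.txt")
rp.read()  # downloads and parses the robots.txt file

url = "https://www.facebook.com/directory/pages/"  # example URL, for illustration only
if rp.can_fetch(USER_AGENT, url):
    print("Allowed to fetch:", url)
else:
    print("Disallowed by robots.txt:", url)
```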
That said, I did around 15 million update requests per day back when they had profile boxes, and never had a problem.