My Python program always kills my internet connection after running for a few hours. How do I debug and fix this?

Posted 2024-10-12 04:32:07


I'm writing a Python script that checks/monitors the status of several servers/websites (response time and similar things). It's a GUI program, and I use a separate thread to check each server/website. The basic structure of each thread is an infinite while loop that requests the site at a random interval (every 15 to 30 seconds); once something changes on the website/server, the thread starts a new thread to do a thorough check (requesting more pages and so on).
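
To make the structure concrete, each monitoring thread does roughly this (a minimal sketch, not my actual code; SITES, the timeout, and the error handling are made up):

import random
import threading
import time
import urllib2

SITES = ['http://example.com/status']  # hypothetical list of monitored URLs

def monitor(url):
    # one thread per site: poll at a random 15-30 second interval
    while True:
        try:
            page = urllib2.urlopen(url, timeout=30)
            try:
                page.read()  # measure response time / detect changes here
            finally:
                page.close()  # always release the socket
        except urllib2.URLError:
            pass  # a detected change would spawn the thorough-check thread here
        time.sleep(random.uniform(15, 30))

for site in SITES:
    t = threading.Thread(target=monitor, args=(site,))
    t.daemon = True  # in the real program the GUI main loop keeps the process alive
    t.start()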

The problem is, my internet connection always gets blocked/jammed/messed up after the script has been running for several hours. From the script's side I get "urlopen error timed out" every time it requests a page, and from my Firefox browser's side I cannot open any site. The weird thing is, the moment I close the script my internet connection comes back immediately, which means I can surf any site through my browser again, so it must be the script causing all the problems.

I've checked the program carefully and even use del to delete every connection once it has been used, but I still get the same problem. I only use urllib2, urllib, and mechanize to make network requests.

Does anybody know why this happens? How do I debug this problem? Is there a tool or something to check my network status once the situation occurs? It's really been bugging me for a while...

By the way, I'm behind a VPN; does that have something to do with the problem? I don't think so, because my network always comes back once the script is closed, and the VPN connection never drops (as far as it appears) during the whole process.

[Update:]

Just found more info about this problem: when my program brings down the internet connection, it's not totally "down". I mean, I cannot open any site in my browser and always get "urlopen error timed out", but I can still get replies from "ping google.com" on the command line. And when I manually drop the VPN connection and then redial, without closing my program, everything starts working again and I can also surf the net through my browser. Why is this happening?

Answers (2)

爱本泡沫多脆弱 2024-10-19 04:32:07


This may or may not be the problem, but it's a good idea to always use context managers when dealing with things that open resources, like files or URLs.

Since Python 2.5 you can do this with files:

with open('/tmp/filename', 'rt') as infile:
    data = infile.read()
    whatever(data)

And the file will be automatically closed at the end of the block.

urllib2 doesn't support this automatically, but you can use contextlib to help you:

>>> import contextlib
>>> with contextlib.closing(urllib2.urlopen('http://www.python.org')) as page:
...   for line in page:
...     print(line)
<html> blablablabla</html>

This way the connection will be both closed and deleted at the end of the with-block, so you don't have to think about it. :-)
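
Applied to a polling loop like the one in the question, that might look like this sketch (fetch and the 30-second timeout are illustrative, not from the asker's code):

import contextlib
import urllib2

def fetch(url):
    # closing() guarantees the connection is released even if read() raises
    with contextlib.closing(urllib2.urlopen(url, timeout=30)) as page:
        return page.read()

Each poll then leaves no dangling socket behind, which matters when dozens of threads repeat a request every 15 to 30 seconds.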

凡间太子 2024-10-19 04:32:07

  • You could possibly be creating more threads than you expect - monitor the result of threading.active_count() to test this (see the sketch after this list).

  • If possible try to rule out the VPN at your end (or post the relevant guts of the code so we can test it).

  • (Netiquette) If you're not already doing so, use at most network.http.max-connections-per-server threads per monitored site/host (that Firefox setting reflects a polite per-server connection limit).

  • (For reference) urlopen returns a file-like object - call .close() (or use del) on this object, or the socket will sit in a CLOSE_WAIT state until it times out.
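
To watch for both of those problems while the script runs, here is a small diagnostic sketch (it assumes Python 2.7 for subprocess.check_output, and shells out to netstat, which exists on both Windows and Unix, though output formats differ):

import subprocess
import threading

def diagnostics():
    # how many threads are alive right now? A steadily growing
    # number would confirm a thread leak.
    print('active threads: %d' % threading.active_count())
    # count sockets stuck in CLOSE_WAIT - these are responses that
    # were never closed and are still holding connections open
    out = subprocess.check_output(['netstat', '-an'])
    print('CLOSE_WAIT sockets: %d' % out.count('CLOSE_WAIT'))

Call diagnostics() periodically (say once a minute from its own thread); if either number keeps climbing before the connection dies, you've found the leak.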

Hopefully these points are, well, pointers.
