urllib freezes if the URL is too big!
OK, I'm trying to open a URL using urllib, but the problem is that the file is too big, so Python freezes when I open the URL. I'm also using wxPython, which freezes as well, and my CPU goes to almost 100% while the URL is being opened.
Any solutions?
Is there a way I can open the URL in chunks, maybe with a time.sleep(0.5) in there, so it does not freeze?
This is my code:
import urllib

f = open("hello.txt", 'wb')
datatowrite = urllib.urlopen(link).read()
f.write(datatowrite)
f.close()
Thanks
You want to split the download into a separate thread, so your UI thread continues to work while the download thread does the work in the background. That way you don't get the "freeze" while the download happens.
Read more about threading here:
http://docs.python.org/library/threading.html
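As a minimal sketch of that idea (written against Python 3's urllib.request, the successor of the urllib.urlopen call in your snippet; `link` is the URL variable from your question), this also reads the response in fixed-size chunks so the whole file is never held in memory at once:

```python
import threading
import urllib.request  # Python 3; your urllib.urlopen is the Python 2 API


def download(link, filename, chunk_size=8192):
    """Fetch `link` into `filename` in fixed-size chunks, so the whole
    response is never held in memory at once."""
    with urllib.request.urlopen(link) as response, open(filename, "wb") as out:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)


# In the wx app, start the download on a background thread so the UI
# thread keeps pumping events (`link` is the URL from the question):
# worker = threading.Thread(target=download, args=(link, "hello.txt"), daemon=True)
# worker.start()
```

Note that if the thread needs to update the GUI when it finishes, it should not touch wx widgets directly from the worker thread; hand the result back to the UI thread instead.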
Alternatively, you could have the system download the file outside of Python using curl or wget.
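For example (a sketch assuming curl or wget is installed on your system; `link` is again the URL variable from your question), you can spawn the tool as a child process so Python and the wx event loop stay free while it runs:

```python
import subprocess


def external_download_cmd(link, filename, tool="curl"):
    """Build the argv for handing the download to an external tool
    (curl or wget must be installed on the system)."""
    if tool == "wget":
        return ["wget", "-q", "-O", filename, link]
    return ["curl", "-sL", "-o", filename, link]


# proc = subprocess.Popen(external_download_cmd(link, "hello.txt"))
# proc.poll() is None while the download is still running; the wx UI
# thread is never blocked, because the work happens in a child process.
```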