Python streaming HTTP client with keep-alive
I need a python http client that can reuse connections and that supports consuming the stream as it comes in. It will be used to parse xml streams, sax style.
I came up with a solution, but I'm not sure it is the best one (there are quite a few ways of writing an HTTP client in Python).
import httplib  # Python 2's httplib (http.client in Python 3)

class Downloader:
    def __init__(self, host):
        # One connection per host; httplib keeps the socket open
        # between requests (HTTP keep-alive).
        self.conn = httplib.HTTPConnection(host)

    def get(self, url):
        self.conn.request("GET", url)
        resp = self.conn.getresponse()
        # Yield the body in small chunks so a parser can consume
        # it as it arrives.
        while True:
            data = resp.read(10)
            if not data:
                break
            yield data
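For context, here is a minimal sketch of feeding chunks like these into an incremental SAX parser with the stdlib xml.sax module (the handler class and the <title> element are illustrative, not part of the original code):

```python
import xml.sax

class TitleCollector(xml.sax.ContentHandler):
    # Illustrative handler: collects the text of every <title> element.
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def startElement(self, name, attrs):
        if name == "title":
            self.in_title = True
            self.titles.append("")

    def endElement(self, name):
        if name == "title":
            self.in_title = False

    def characters(self, content):
        # May be called several times per element; accumulate.
        if self.in_title:
            self.titles[-1] += content

def parse_stream(chunks, handler):
    # Feed chunks to the parser as they arrive instead of buffering
    # the whole response body first.
    parser = xml.sax.make_parser()
    parser.setContentHandler(handler)
    for chunk in chunks:
        parser.feed(chunk)
    parser.close()
```

With the Downloader above, usage would look like `parse_stream(downloader.get("/feed.xml"), handler)`.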
Thanks folks!
2 Answers
There is also pycurl. By default keepalive is turned on and you can write to a file for output.
Follow the examples, they are quite helpful
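A hedged sketch along those lines: pycurl.Curl, setopt, WRITEFUNCTION, and perform are the real pycurl API, but the writer class and the way the handle is threaded through are illustrative. Reusing one Curl handle across calls is what gives keep-alive.

```python
class ChunkWriter:
    # Write callback: pycurl calls this with each chunk of the body
    # as it streams in.
    def __init__(self):
        self.chunks = []

    def __call__(self, data):
        self.chunks.append(data)

def fetch(curl, url, writer):
    # pycurl imported lazily so the sketch reads standalone; pass the
    # same Curl handle on every call to keep the connection alive.
    import pycurl
    curl.setopt(pycurl.URL, url)
    curl.setopt(pycurl.WRITEFUNCTION, writer)
    curl.perform()

# Usage sketch (hypothetical URL):
#   c = pycurl.Curl()
#   w = ChunkWriter()
#   fetch(c, "http://example.com/feed.xml", w)
```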
urlgrabber supports keepalive and can return a file-like object.