Python: serving a file with urllib and BaseHTTPServer while it is still being downloaded
I am making an RPM file cache server. PCs on the network access the cache server. If the file is present on the server, it is served. If not, it is downloaded from the internet before being served.
I wrote this with BaseHTTPServer, using urllib to fetch the files. With small files there is little delay between downloading the file and serving it.
...
# read the entire download into memory, then write it to the cache file
store_file.write(download_buffer.read())
store_file.close()
...
# open the cached copy (binary mode for RPMs) and send it all to the client
f = open(file_path, 'rb')
self.wfile.write(f.read())
...
But some files may take minutes to download, so the client is kept waiting while the server finishes the file. This may cause the client to time out. How do we serve the file as it is being downloaded, to prevent the client from timing out?
Comments (1)
A read-write loop.
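That is, instead of downloading the whole file and only then serving it, read the download in chunks and write each chunk to both the cache file and the client socket as it arrives. Below is a minimal sketch of such a loop (Python 2, to match the BaseHTTPServer and urllib used in the question); the class name, method name, and chunk size are illustrative, not from the original post:

import BaseHTTPServer
import urllib

CHUNK_SIZE = 64 * 1024  # stream in 64 KiB pieces (illustrative value)

class CacheHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def copy_to_cache_and_client(self, url, file_path):
        # Assumes send_response()/end_headers() have already been called.
        # Read the download in chunks; write every chunk to the cache file
        # and to the client, so serving starts as soon as data arrives.
        download_buffer = urllib.urlopen(url)
        store_file = open(file_path, 'wb')
        try:
            while True:
                chunk = download_buffer.read(CHUNK_SIZE)
                if not chunk:
                    break                    # download finished
                store_file.write(chunk)      # keep filling the local cache
                self.wfile.write(chunk)      # stream the same bytes to the client
        finally:
            store_file.close()
            download_buffer.close()

If the upstream server reports a Content-Length header, it can be forwarded to the client before entering the loop; otherwise the connection can simply be closed when the loop ends. If the download fails part-way, the partially written cache file should be deleted so that later requests do not serve a truncated RPM.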