Filesystem-based compression cache -- does it exist?
I'm looking for a program with really quite specific functionality, which hopefully exists so I don't have to implement it myself. I can best describe this hypothetical program as a filesystem-based cache of compressed copies of documents in the working directory. I've made a mock-up of what I would expect this program to do:
james@pc:~/htdocs/$ tree -a
.
|-- image.png
`-- index.html
0 directories, 2 files
james@pc:~/htdocs/$ zipcache init
Initialized cache in ./.zipcache/
james@pc:~/htdocs/$ tree -a
.
|-- .zipcache
|   |-- gzip
|   |   `-- index.html.gz
|   `-- lzma
|       `-- index.html.lzma
|-- image.png
`-- index.html
1 directory, 3 files
james@pc:~/htdocs/$ zipcache gzip index.html
... zipcache emits gzipped copy of index.html on stdout by cat-ing ./.zipcache/gzip/index.html.gz
james@pc:~/htdocs/$ zipcache lzma index.html
... zipcache emits lzma'd copy of index.html on stdout by cat-ing ./.zipcache/lzma/index.html.lzma
james@pc:~/htdocs/$ zipcache lzma image.png
... zipcache generates error signifying cache miss (it's intelligent enough to know that PNG shouldn't be further zipped) ...
My ultimate concern is caching compressed copies of static files that are repeatedly transferred over HTTP with Content-Encoding enabled. I have no desire to compress a file anew every time it is requested.
I would still appreciate being pointed in the right direction if something even vaguely similar to the above exists -- my Google searching has been quite unsuccessful (perhaps there is terminology for this functionality that I don't know about).
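To make the behaviour concrete, here is a rough sketch of how such a tool might be implemented as a shell script. Everything in it is hypothetical -- the zipcache name, the ./.zipcache/ layout, and the crude extension whitelist standing in for real "will this compress well?" detection:

#!/bin/sh
# Hypothetical zipcache sketch: keep gzip and lzma copies of compressible
# files under ./.zipcache/ and emit them from the cache on request.
set -e

CACHE=.zipcache

case "$1" in
init)
    mkdir -p "$CACHE/gzip" "$CACHE/lzma"
    # Crude whitelist of extensions assumed worth compressing; a real tool
    # would detect already-compressed formats such as PNG properly.
    for f in *.html *.css *.js *.txt; do
        [ -f "$f" ] || continue
        gzip -9 -c "$f" > "$CACHE/gzip/$f.gz"
        xz --format=lzma -c "$f" > "$CACHE/lzma/$f.lzma"
    done
    echo "Initialized cache in ./$CACHE/"
    ;;
gzip)
    # Cache hit: stream the stored copy; miss: fail with a nonzero status.
    cat "$CACHE/gzip/$2.gz" 2>/dev/null ||
        { echo "zipcache: cache miss: $2" >&2; exit 1; }
    ;;
lzma)
    cat "$CACHE/lzma/$2.lzma" 2>/dev/null ||
        { echo "zipcache: cache miss: $2" >&2; exit 1; }
    ;;
*)
    echo "usage: $0 init | $0 {gzip|lzma} FILE" >&2
    exit 2
    ;;
esac

The web server would then stream the cached copy with the matching Content-Encoding header instead of compressing the file anew on every request.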
2 Answers
[Admittedly, this is answering the "one up from this" question instead.]
It seems like doing this yourself is a bad idea; you should let the web server do it.
I'm guessing you're using Apache on a Unix variant, but for completeness:
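One well-known Apache recipe (a sketch only; it assumes mod_rewrite and mod_headers are enabled, and that each compressible file has a pre-built sibling such as index.html.gz) is to rewrite requests to the compressed copy whenever the client advertises gzip support, rather than compressing on every request with mod_deflate:

RewriteEngine On
# If the client accepts gzip and a pre-compressed copy exists on disk,
# serve index.html.gz in place of index.html.
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.+)$ $1.gz [L]

# Label the .gz copies so browsers decode them transparently.
<FilesMatch "\.html\.gz$">
    ForceType text/html
    Header set Content-Encoding gzip
</FilesMatch>

nginx has the same idea built in: with gzip_static on; it serves an existing foo.gz in place of foo instead of recompressing per request.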
I guess you could write a fairly simple PHP script for that kind of caching. I'm not sure whether such a thing already exists.