Filesystem-based compression cache: does one exist?

Posted 2024-08-17


I'm looking for a program with really quite specific functionality, which hopefully exists so I don't have to implement it myself. I can best describe this hypothetical program as a filesystem-based cache of compressed copies of documents in the working directory. I've made a mock-up of what I would expect this program to do:

james@pc:~/htdocs/$ tree -a
.
|-- image.png
`-- index.html

0 directories, 2 files

james@pc:~/htdocs/$ zipcache init
Initialized cache in ./.zipcache/
james@pc:~/htdocs/$ tree -a
.
|-- .zipcache
|   |-- gzip
|   |   `-- index.html.gz
|   `-- lzma
|       `-- index.html.lzma
|-- image.png
`-- index.html

1 directory, 3 files
james@pc:~/htdocs/$ zipcache gzip index.html
... zipcache emits gzipped copy of index.html on stdout by cat-ing ./.zipcache/gzip/index.html.gz
james@pc:~/htdocs/$ zipcache lzma index.html
... zipcache emits lzma'd copy of index.html on stdout by cat-ing ./.zipcache/lzma/index.html.lzma
james@pc:~/htdocs/$ zipcache lzma image.png
... zipcache generates error signifying cache miss (it's intelligent enough to know that PNG shouldn't be further zipped) ...

My ultimate concern is caching compressed copies of static files that are repeatedly transferred over HTTP with Content-Encoding enabled. I don't want to recompute the compression every time a file is requested.

I would still appreciate a pointer in the right direction if something even vaguely similar to the above exists; my Google searching has been quite unsuccessful (perhaps there is terminology for this functionality that I don't know about).
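The mock-up above can be sketched in a few lines of shell. Everything here is hypothetical: the `zipcache` name and the `.zipcache/` layout come from the mock-up, not from any real tool, and only gzip is shown (lzma would be analogous):

```shell
#!/bin/sh
# Sketch of the hypothetical "zipcache" tool (gzip only; lzma would be analogous).
set -e

dir=$(mktemp -d)                     # stand-in for ~/htdocs
cd "$dir"
printf '<html>hello</html>' > index.html
printf 'binary-image-data' > image.png

# "zipcache init": precompute gzipped copies of compressible files only
mkdir -p .zipcache/gzip
for f in *.html; do
    gzip -c "$f" > ".zipcache/gzip/$f.gz"
done

# "zipcache gzip FILE": emit the cached copy on stdout, or fail on a cache miss
zipcache() {
    cat ".zipcache/$1/$2.gz" 2>/dev/null \
        || { echo "zipcache: cache miss for $2" >&2; return 1; }
}

zipcache gzip index.html | gunzip    # round-trips back to the original
zipcache gzip image.png || true      # PNG was never cached: cache miss
```

A real version would also have to invalidate cache entries when the source file changes (e.g. by comparing mtimes), which this sketch omits.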

2 Answers

空气里的味道 2024-08-24 00:26:34


[Admittedly, this is answering the "one up from this" question instead]

It seems like doing this yourself is a bad idea and you should let the web server do this.

I'm guessing you're using Apache on a Unix variant, but for completeness:
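On Apache, the usual way to let the server handle this is mod_deflate, which gzips responses on the fly (a minimal sketch; the original answer's own configuration did not survive, and the MIME types listed here are illustrative):

```apache
# Compress text-like responses on the fly (requires mod_deflate to be loaded)
AddOutputFilterByType DEFLATE text/html text/css application/javascript
# Already-compressed formats such as PNG are simply not listed, so they pass through untouched
```

Note this compresses per request rather than caching the compressed output, which is the part the question specifically asks about.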

超可爱的懒熊 2024-08-24 00:26:34


I guess you could write a fairly simple PHP script for this kind of caching. I'm not sure whether such a thing already exists.
