Is there a way to do pre-compression with on-the-fly decompression in nginx?

Asked 2024-08-05 18:55:15

It's easy to use the pre-compression module to look for a pre-compressed .gz version of a page and serve it to browsers that accept gzip, avoiding the overhead of on-the-fly compression. What I would like to do, though, is eliminate the uncompressed version from disk and store only the compressed version, which would obviously be served the same way; but if a user agent that does not support gzip requests the page, I would like nginx to decompress it on the fly before transmitting it.

Has anyone done this, or are there other high-performance web servers that provide this functionality?

2 Answers

叹倦 2024-08-12 18:55:15

The best way to send static pre-compressed gzipped files on Nginx is by using the http_gzip_static_module.
More specifically, in the configuration you will want:

gzip_static always;

http://nginx.org/en/docs/http/ngx_http_gzip_static_module.html

To be able to serve up the decompressed version of the file (i.e. you keep only the .gz file on your server to save on I/O), you will want to use the http_gunzip_module. In your config it looks like this:

gunzip on;

http://nginx.org/en/docs/http/ngx_http_gunzip_module.html
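
Putting the two directives together, a minimal sketch of a server block might look like this (the port, server_name and root below are placeholder assumptions; only the .gz files need to exist on disk):

server {
    listen 80;
    server_name example.com;   # placeholder host name
    root /var/www/html;        # placeholder document root containing only .gz files

    location / {
        gzip_static always;    # always serve the pre-compressed .gz file
        gunzip on;             # decompress on the fly for clients that don't accept gzip
    }
}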

You may also be interested in the links at the bottom of the gunzip_module page.

P.S. When pre-compressing files I'd suggest using Google's Zopfli compression algorithm; it will increase the build time (not the decompression time) but decrease the file size by about 5%.
https://code.google.com/p/zopfli/

笑叹一世浮沉 2024-08-12 18:55:15

One option is to have a fall-back upstream server to decompress the file, e.g.:

gzip_static on;
...
upstream decompresser {
    server localhost:8080; # backend script which will decompress the file
}

location / {
    try_files $uri @decompress;
}

location @decompress {
    proxy_pass http://decompresser;
}

Another option would be to use the embedded Perl module as the fall-back rather than the upstream; however, this can cause nginx to block, and if the operation takes a while it could decrease performance.

With the upstream model you may be able to take advantage of nginx's X-Accel-Redirect (its X-Sendfile equivalent) by using the system's gzip program to decompress to a file in the /tmp directory. This could save on per-request decompression overhead by allowing the file to hang around for a short while.
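
As a rough sketch of that idea (the temp path, file name, and internal location name here are assumptions, not part of the original answer): the backend decompresses the .gz file into /tmp and responds with an X-Accel-Redirect header pointing at an internal location, so nginx serves the decompressed copy itself:

# Hypothetical internal location for the X-Accel-Redirect approach; the backend
# would decompress foo.html.gz to /tmp/decompressed/foo.html and respond with
# "X-Accel-Redirect: /decompressed/foo.html" so nginx serves the file directly.
location /decompressed/ {
    internal;                  # not reachable by clients, only via X-Accel-Redirect
    alias /tmp/decompressed/;
}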
