Who compresses their HTML?

Posted on 2024-08-24 08:12:03

Even Stack Overflow doesn't compress their HTML. Is it recommended to compress HTML? As far as I've seen, it looks like Google is the only one.... (view the source). Why isn't this standard practice?

Comments (5)

ぺ禁宫浮华殁 2024-08-31 08:12:03

I think you are confusing HTML source-code minification with GZIP compression. The latter is quite common (for example, using mod_gzip on Apache; article here) and should be enough in most cases. It is entirely internal between the server and the browser; you can't see it in the source code.

Actual minification of the HTML is not really worth doing, except for sites where a single saved byte can mean tens of thousands of dollars in traffic savings (like Google).
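
To see how transparent that negotiation is, here is a minimal sketch (my illustration, not part of the original answer; the class name is made up and the URL is just an example) that does by hand what every browser does automatically: ask for a gzipped response, then decompress it before reading the HTML.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.zip.GZIPInputStream;

public class GzipFetch {

    public static void main(String... args) throws IOException {
        HttpURLConnection connection =
            (HttpURLConnection) new URL("http://stackoverflow.com").openConnection();
        // Advertise that we can handle a gzipped response body.
        connection.setRequestProperty("Accept-Encoding", "gzip");

        InputStream body = connection.getInputStream();
        // If the server actually gzipped the body, unwrap it; what we read
        // afterwards is byte-for-byte the same HTML as without compression.
        if ("gzip".equalsIgnoreCase(connection.getContentEncoding())) {
            body = new GZIPInputStream(body);
        }

        BufferedReader reader = new BufferedReader(new InputStreamReader(body));
        System.out.println(reader.readLine()); // first line of the decompressed HTML
        reader.close();
    }

}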

内心激荡 2024-08-31 08:12:03

HTML minification apparently doesn't matter that much for Stack Overflow. I did a little test based on the HTML source of the front page.

Raw content length: 207454 bytes
Gzipped content length: 30915 bytes
Trimmed content length: 176354 bytes
Trimmed and gzipped content length: 29658 bytes

SO already uses GZIP compression, so trimming whitespace (actual HTML minification, or "HTML compression" as you call it) would save "only" around 1 KB of bandwidth per response. For giants with over 1 million pageviews per day, HTML minification would already save over 1 GB of bandwidth per day (in fact, SO would save that much as well). Google serves billions of pageviews per day, so every byte of difference saves gigabytes per day.
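
A quick back-of-the-envelope check of that claim (my sketch, not part of the original answer; it just plugs in the measurements listed above):

public class SavingsEstimate {

    public static void main(String... args) {
        long gzipped = 30915;            // gzipped bytes, measured above
        long trimmedAndGzipped = 29658;  // trimmed + gzipped bytes, measured above
        long savedPerResponse = gzipped - trimmedAndGzipped; // 1257 bytes, ~1.2 KB
        long pageviewsPerDay = 1000000L;
        double savedGBPerDay = savedPerResponse * pageviewsPerDay / 1e9;
        System.out.printf("Saved per day: %.2f GB%n", savedGBPerDay); // ~1.26 GB
    }

}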

FWIW, I used this simple quick'n'dirty Java application to test it:

package com.stackoverflow.q2424952;

import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.zip.GZIPOutputStream;

public class Test {

    public static void main(String... args) throws IOException {
        // Download the front page once, then measure it raw, gzipped,
        // trimmed, and trimmed + gzipped.
        InputStream input = new URL("http://stackoverflow.com").openStream();
        byte[] raw = raw(input);
        System.out.println("Raw content length: " + raw.length + " bytes");
        byte[] gzipped = gzip(new ByteArrayInputStream(raw));
        System.out.println("Gzipped content length: " + gzipped.length + " bytes");
        byte[] trimmed = trim(new ByteArrayInputStream(raw));
        System.out.println("Trimmed content length: " + trimmed.length + " bytes");
        byte[] trimmedAndGzipped = gzip(new ByteArrayInputStream(trimmed));
        System.out.println("Trimmed and gzipped content length: " + trimmedAndGzipped.length + " bytes");
    }

    // Reads the stream byte for byte into a byte array, unmodified.
    public static byte[] raw(InputStream input) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        for (int data; (data = input.read()) != -1; output.write(data));
        input.close(); output.close(); return output.toByteArray();
    }

    // Pipes the stream through GZIPOutputStream; closing it flushes the
    // compressed bytes into the underlying byte array.
    public static byte[] gzip(InputStream input) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(output);
        for (int data; (data = input.read()) != -1; gzip.write(data));
        input.close(); gzip.close(); return output.toByteArray();
    }

    // Crude "minification": strips leading/trailing whitespace from every
    // line and drops the line breaks themselves.
    public static byte[] trim(InputStream input) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(input));
        for (String line; (line = reader.readLine()) != null;) output.write(line.trim().getBytes());
        reader.close(); output.close(); return output.toByteArray();
    }

}

慕烟庭风 2024-08-31 08:12:03

Another good reason not to minify your code is for learning. I love being able to look through people's source code to see how they solve problems, and likewise I keep my source in full form so others can look at mine. I still have my code gzip-compressed before it's sent to the browser, but when it arrives it is decompressed back into its full, readable form.

倦话 2024-08-31 08:12:03

I think few people do. Too much work, too little gain, especially since the HTTP payload can be gzip-compressed these days.

夜访吸血鬼 2024-08-31 08:12:03

Gzip compression, which every modern web server and browser supports, makes HTML compression (minification) useless or almost insignificant.

So very few do it.
