Optimal JavaScript package size
When you serve JavaScript to a page, it is best to serve one packaged, minified, and gzipped file to reduce latency and request times.
But is it better to send
- one big package for your entire website,
- one big package for each page in your website, or
- loading from a CDN? (I don't want to load from a CDN.)
With option 1 you load more on the initial request, but then all of your JavaScript is cached for the rest of the visit to your website.
With option 2 you load only as much as is necessary for a page, so the initial load time is reduced, but you don't get the same file cached across every page of your website.
Which method is preferred?
I'm using node.js to serve my JavaScript and ender to package it.
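For context, a minimal sketch of what serving one gzipped bundle from node.js can look like; the bundle path, port, and caching header are placeholders, not the asker's actual setup, and minification is assumed to have already happened at build time:

```javascript
var http = require('http');
var fs = require('fs');
var zlib = require('zlib');

http.createServer(function (req, res) {
  // Assumes the client accepts gzip; a real server should check Accept-Encoding.
  if (req.url === '/js/app.bundle.js') { // hypothetical bundle path
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      'Content-Encoding': 'gzip',
      'Cache-Control': 'public, max-age=31536000' // long cache; rename the bundle when it changes
    });
    fs.createReadStream('./public/app.bundle.js').pipe(zlib.createGzip()).pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);
```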
Edit
Phrased differently: I'm thinking of automating my packaging. The automation will either package everything into one file for my entire website, or package a page-specific list of files into one file for each page.
I don't have any statistics on my JavaScript files yet, but I'm curious which of the two automations I should implement.
1 Answer
Option 1 is tenable.
Option 2 is a bad idea (for the reasons you specify).
You're missing option 4, which is to have one large core package, with instance additions (secondary JavaScript includes) as necessary... there's no reason to load your Google Maps code on every page when you only need it here and there, for instance. But there's also no reason to re-serve your 'core' packages.
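As an illustration of that pattern, here is a sketch of loading a secondary script only on the pages that need it; the element id and bundle URL are made-up examples, not part of the answer:

```javascript
// Append a script tag and run a callback once it has loaded.
function loadScript(src, onLoad) {
  var script = document.createElement('script');
  script.src = src;
  script.onload = onLoad;
  document.getElementsByTagName('head')[0].appendChild(script);
}

// Page-specific code: only fetch the maps bundle if this page has a map.
if (document.getElementById('map-container')) {
  loadScript('/static/maps.bundle.js', function () {
    // initialise the map widget here
  });
}
```

The core package stays cached across every page, while heavy extras like this are only ever requested where they are used.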
This is generally the option I use. When speed is super-important, I use a subdomain with just about everything stripped out of Apache (no sessions or cookies, PHP, etc.). I actually have one server that acts as a central static repository for all of my clients, plus an extra A record in DNS for 'static', using virtual domains.
Added: In response to your edit, I think the most appropriate thing is to have a list of files that need to be globbed together by your automation. Instead of taking 'everything', just take all of the items in a 'to_package' array.
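A rough sketch of what that could look like with plain node.js, assuming a hypothetical 'to_package' list and output path (ender or any other packager could fill the same role in practice):

```javascript
var fs = require('fs');
var path = require('path');

// Only the files listed here end up in the core bundle; page-specific
// scripts stay out of it and are served separately.
var to_package = ['lib/jquery.js', 'lib/app.js', 'lib/ui.js'];

var core = to_package.map(function (file) {
  return fs.readFileSync(path.join(__dirname, file), 'utf8');
}).join('\n;\n'); // the stray semicolon guards against files missing a trailing one

fs.writeFileSync(path.join(__dirname, 'public', 'core.bundle.js'), core);
```

Run it as a build step before deploying, then minify and gzip the resulting core.bundle.js.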