Serve jQuery UI from Google's CDN or as a local copy?
while it's better to serve jQuery from Google's CDN jQuery UI is a different beast. My local modified copy weighs 60kb and the one in Google's CDN ~200kb.
- Are there any numbers on how many sites uses the CDN? (read: how many users have it in their cache). How do I know/calculate if it's better to serve it locally?
Coming late to the party here, but allowing for gzip compression, you're basically comparing a download of ~51k from Google's CDN (the 197.14k content becomes 51.30k on-the-wire) vs. ~15.5k from your own servers (assuming your 60k file gzips at the same ratio as the full jQuery UI file does, and that you have gzip compression enabled). This takes us into a complex realm of:
And the answer to your question is a big "it depends": try each of them and measure the result in a real-world scenario.
Pre-Existing Cache Copy
If a first-time visitor to your site has previously been to a site using jQuery UI from Google's CDN and it's still in their cache, that wins hands down. Full stop. No need to think about it any further. Google uses appropriate caching headers and the browser doesn't even have to send the request to the server, provided you link to a fully-specified version of jQuery UI (not one of the "any version of 1.8.x is fine" URLs — if you ask for jQuery UI 1.8.16, Google will return a resource that can be cached for up to a year, but if you ask for jQuery UI 1.8.x [e.g., any dot rev of 1.8], that resource is only good for an hour).
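For illustration, the difference is just in the URL you link to. The fully-versioned URL below is the one quoted elsewhere on this page; the partial-version form is shown only as a sketch of the shorter-lived variant, and the exact path is my recollection of what Google accepted, so double-check it:

    <!-- Fully specified version: Google serves this as cacheable for up to a year -->
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>

    <!-- Partial version ("latest 1.8.x"): only cacheable for about an hour -->
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js"></script>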
But let's suppose they haven't...
Latency and Transfer Time
Latency is how long it takes to set up the connection to the server, and transfer time is the time actually spent transferring the resource. Using my DSL connection (I'm not very close to my exchange, so I typically get about 4Mbit throughput on downloads; e.g., it's an okay connection, but nothing like what Londoners get, or those lucky FiOS people in the States), in repeated experiments downloading Google's copy of jQuery UI I typically spend ~50ms waiting for the connection (latency) and then 73ms doing data transfer (SSL would change that profile, but I'm assuming a non-SSL site here). Compare that with downloading Google's copy of jQuery itself (89.52k gzipped to 31.74k), which has the same ~50ms latency followed by ~45ms of downloading. Note how the download time is proportional to the size of the resource (31.74k / 51.30k = 0.61871345, and sure enough, 73ms x 0.61871345 = 45ms), but the latency is constant. So assuming your copy comes in at 15.5k, you could expect (for me) a 50ms latency plus about 22ms of actual downloading. All other things being equal, by hosting your own 60k copy vs. Google's 200k copy, you would save me a whopping 52ms. Let's just say that I wouldn't notice the difference.
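As a back-of-envelope sketch of that arithmetic (the numbers are just the ones measured above on one DSL connection, not constants you can rely on):

    // Rough estimate using the figures above; these are measured values from one
    // connection, not universal constants.
    var latencyMs = 50;                  // time to set up the connection
    var throughputKBperMs = 51.30 / 73;  // ~0.70 KB/ms implied by 51.30k taking 73ms

    function estimatedFetchMs(gzippedKB) {
        return latencyMs + gzippedKB / throughputKBperMs;
    }

    estimatedFetchMs(51.30); // ~123ms: Google's full jQuery UI, gzipped
    estimatedFetchMs(15.5);  // ~72ms:  a hypothetical 60k custom build, gzipped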
All is not equal, however. Google's CDN is highly optimized, location-aware, and very fast indeed. For instance, let's compare downloading jQuery from Heroku.com. I chose them because they're smart people running a significant hosting business (currently using the AWS stack), and so you can expect they've at least spent some time optimizing their delivery of static content — and it happens they use a local copy of jQuery for their website; and they're in the U.S. (you'll see why in a moment). If I download jQuery from them (shockingly, they don't appear to have gzip enabled!), the latency is consistently in the 135ms range (with occasional outliers). That's consistently 2.7 times as much latency as to Google's CDN (and my throughput from them is slower, too, roughly half the speed; perhaps they only use AWS instances in the U.S., and since I'm in the UK I'm further away from them).
The point here being that latency may well wash out any benefit you get from the smaller file size.
Number of Requests
If you have any JavaScript files you're going to host locally, your users are still going to have to get those. Say you have 100k of your own script for your site. If you use Google's CDN, your users have to get 200k of jQuery UI from Google and 100k of your script from you. The browser may put those requests in parallel (barring your using async or defer on your script tags, the browser has to execute the scripts in strict document order, but that doesn't mean it can't download them in parallel). Or it may well not. As we've established that for non-mobile users, at these sizes the actual data transfer time doesn't really matter that much, you may find that taking your local jQuery UI file and combining it with your own script, thus requiring only one download rather than two, may be more efficient even despite the Google CDN goodness.
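To make that concrete, here is a sketch of the two approaches (the local filenames are placeholders, not anything from the question):

    <!-- Two requests: jQuery UI from Google's CDN plus your own script.
         Browsers can download these in parallel, but must execute them in document order. -->
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
    <script src="/js/my-site-scripts.js"></script>

    <!-- One request: your trimmed jQuery UI build concatenated with your own script. -->
    <script src="/js/my-site-combined-1.js"></script>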
This is the old "At most one HTML file, one CSS file, and one JavaScript file" rule. Minimizing HTTP requests is a Good Thing™. Similarly, if you can use sprites rather than individual images for various things, that helps keep image requests down.
Proper Cache Headers
If you're hosting your own script, you'll want to be absolutely sure it's cacheable, which means paying attention to the cache headers. Google's CDN basically doesn't trust HTTP/1.0 caches (it sets the Expires header to the current date/time), but does trust HTTP/1.1 caches (the overwhelming majority) because it sends a max-age header (of a year for fully-specified resources). I'm guessing they have a reason for that; you might consider following suit.

Since you want to change your own scripts sometimes, you'll want to put a version number on them, e.g. "my-nifty-script-1.js" and then "my-nifty-script-2.js", etc. That's so you can set long max-age headers, but know that when you update your script, your users will get the new one. (This goes for CSS files, too.) Do not use the query string for the versioning; put the number actually in the resource name. Since your HTML presumably changes regularly, you probably want short expirations on that, but of course it totally depends on your content.
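For illustration, this is the kind of response you'd aim for on a versioned script of your own (a sketch only; a year is 31,536,000 seconds, and the exact header set depends on your server configuration):

    HTTP/1.1 200 OK
    Content-Type: application/javascript
    Content-Encoding: gzip
    Cache-Control: public, max-age=31536000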
Conclusion
It depends. If you don't want to combine your script with your local copy of jQuery UI, you're probably better off using Google for jQuery UI. If you're happy to combine them, you'll want to do real-world experiments either way to make your own decision. It's entirely possible other factors will wash this out and it won't really matter. If you haven't already, it's worth reviewing Yahoo's and Google's website speed advice pages.
Google's CDN copy of jQuery UI weighs in at 51 Kb:
https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js
The HTML5 Boilerplate uses a fallback for jQuery loading.
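It looks roughly like this (the jQuery version number and the local path are placeholders that vary by Boilerplate release and by project; check the current Boilerplate for the exact line):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
    <script>window.jQuery || document.write('<script src="/js/libs/jquery-1.7.1.min.js"><\/script>')</script>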
You can apply the same pattern to jQuery UI.
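Applied to jQuery UI it becomes something like the following (again, the local path is a placeholder; jQuery UI registers itself as jQuery.ui, which is what the fallback tests for, and the closing tag is escaped so it doesn't end the inline script early):

    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
    <script>window.jQuery.ui || document.write('<script src="/js/libs/jquery-ui-1.8.16.min.js"><\/script>')</script>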
You load the CDN version, then check for the existence of jQuery UI (you can't guarantee 100% up-time for any CDN). If jQuery UI doesn't exist, fall back to your local copy. This way, if they already have it in their cache, you're good to go. If they don't, and the CDN can't be reached for any reason, you're good to go with your local copy. Fail-safe.
I think size comparisons miss the point of the CDN. By serving a copy of jQuery (or other library) from a public, commonly-used CDN, many users will have a cached copy of the library before they arrive at your site. When they do, the effective size of the download is 0KB compared to 60KB from your server.
Google's CDN is the most widely used, so you will have the best chance of a cache hit if you reference it.
For numbers comparing the various CDNs, please see this article.
For what it's worth, the minified version of Google's jQuery copy is much smaller than the size you mentioned.
I would say what matters is the load on your server. For the user, it doesn't really matter whether they are downloading it from your server or from Google's. These days there is enough bandwidth that 140kb is easy to ignore on the user's side.
Now, the real question is whether you have made changes to jQuery UI. If yes, then you should serve your own copy. If not, then it's fine to serve Google's, because after all what you are aiming for is to lower the load on your side.
Besides, the caching doesn't happen just in the user's browser, but also on the content distribution nodes they are accessing. So it's safe to say that Google's copy is almost certainly cached somewhere along the way.
With sizes this small, what matters is the number of HTTP requests for a first-time visitor to your site.
If, for example, your site has script combining and minification configured so that the entire script for a first-time visitor is either one request or included in the HTML itself, using your local copy is better, because even a cached copy of jQuery UI isn't faster than all of the site's script showing up at once (the cached call still has to go out and check whether the resource has been modified).
If you don't have a good script combining and minification setup (so you'd be sending jQuery UI separately, either from your site or elsewhere), use outside caches wherever possible.