Tracking download completions from a website/CDN
I have a Drupal website where users are clicking on a link that initiates a file download from a content delivery network (CDN). A script is tracking the number of users who click the link to begin the download process. I'm looking for suggestions on how I might track the number of users who successfully complete the download process.
Comments (1)
If you only need the number of completed downloads, just grab the raw logs from your CDN and run them through a log analysis tool. Most CDNs provide daily access logs as a standard service. The bigger players can do hourly logs or better.
The best solution will depend on your CDN, so talk to them if you haven't already. However, here's how I've done it in the past.
To each protected download URL you generate, append a unique id for the user who made the request. A typical CDN download URL might contain an expiry time and a hash to prevent tampering. You'll want to check with your CDN first to make sure you pick a variable name that doesn't clash with their API. In our case we agreed on the prefix ign_* (meaning "ignore").
Before:
After:
Example (download link for user 1234):
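The concrete URLs from the original post were not preserved, so here is a hypothetical reconstruction: the host, path, and the expires/hash parameter names are invented, and only the ign_ prefix comes from the text above.

```python
from urllib.parse import urlencode, urlparse

def tag_download_url(signed_url: str, user_id: int) -> str:
    """Append a CDN-ignored user marker (ign_uid) to an already-signed URL."""
    separator = "&" if urlparse(signed_url).query else "?"
    return signed_url + separator + urlencode({"ign_uid": user_id})

# Before (hypothetical signed CDN URL with expiry time and anti-tamper hash):
before = "https://cdn.example.com/files/report.pdf?expires=1700000000&hash=abc123"

# After: the same URL with the user marker appended
after = tag_download_url(before, 1234)
print(after)
# https://cdn.example.com/files/report.pdf?expires=1700000000&hash=abc123&ign_uid=1234
```

Because the marker sits outside the signed portion of the URL, it must be appended after signing, and the CDN must be configured (or confirmed) to ignore it when validating the hash.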
Now when you download your raw logs, each entry can be associated with one of your users simply by parsing the query string. From here you can do everything from counting the number of completed downloads, to implementing per-user download reports.
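Pulling the user back out of each log entry is then a one-liner; a minimal sketch, assuming the logged request path is available as a string (the sample path here is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def user_from_request(request_url: str):
    """Recover the ign_uid marker from a logged request URL, or None."""
    params = parse_qs(urlparse(request_url).query)
    return params.get("ign_uid", [None])[0]

print(user_from_request("/files/report.pdf?expires=1700000000&hash=abc123&ign_uid=1234"))
# 1234
```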
In our case, we had logs available every 15 minutes and I automated the fetching and processing to enable byte-level per-user download quotas.
One thing to keep in mind, if you're going to be processing the logs yourself, is to group HTTP 206 partial entries together. Particularly if you're interested in the "number of completed downloads."
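Grouping those partial entries might look like the sketch below. It assumes each log line has already been reduced to (user id, file path, HTTP status, bytes sent) and that the total file size is known; all names and the sample data are hypothetical.

```python
from collections import defaultdict

def count_completed(entries, file_sizes):
    """Count completed downloads, merging HTTP 206 partial responses.

    entries:    iterable of (user_id, path, http_status, bytes_sent)
    file_sizes: {path: total file size in bytes}
    """
    sent = defaultdict(int)  # bytes delivered per (user, file)
    for user, path, status, nbytes in entries:
        if status in (200, 206):
            sent[(user, path)] += nbytes
    # A download counts as complete once the delivered bytes cover the file.
    return sum(1 for (user, path), n in sent.items()
               if n >= file_sizes.get(path, float("inf")))

log = [
    ("1234", "/files/report.pdf", 206, 600_000),
    ("1234", "/files/report.pdf", 206, 400_000),  # resumed; together a full file
    ("5678", "/files/report.pdf", 200, 250_000),  # aborted mid-transfer
]
print(count_completed(log, {"/files/report.pdf": 1_000_000}))
# 1
```

Note that real logs can contain overlapping or repeated byte ranges (retries, multi-segment download managers), so a production version would merge the actual ranges rather than simply summing byte counts.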