Need to check the uptime of a large hosted file

Posted 2024-08-26 07:29:28

I have a dynamically generated RSS feed that is about 150M in size (don't ask).
The problem is that it keeps crapping out sporadically, and there is no way to monitor it without downloading the entire feed to get a 200 status. Pingdom times out on it and returns a 'down' error.

So my question is: how do I check that this thing is up and running?

Comments (3)

长途伴 2024-09-02 07:29:29

What type of web server, and server side coding platform are you using (if any)? Is any of the content coming from a backend system/database to the web tier?

Are you sure the problem is not with the client code accessing the file? Most clients have timeouts and downloading large files over the internet can be a problem depending on how the server behaves. That is why file download utilities track progress and download in chunks.

It is also possible that other load on the web server, or the number of users, is impacting the server. If little memory is available, some servers may not be able to serve a file of that size to many users. You should review how the server is sending the file and make sure it is chunking it up.

I would recommend that you do a HEAD request to check that the URL is accessible and that the server is responding, at a minimum. The next step might be to set up your download test inside, or very close to, the data center hosting the file to monitor further. This may reduce cost and will reduce interference.
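The HEAD check suggested above can be sketched in a few lines of Python. This is a minimal sketch, not from the thread; the `is_up` name, the URL you pass in, and the timeout are all assumptions:

```python
# Minimal sketch: check availability with a HEAD request instead of
# downloading the whole 150M file. Treats any 2xx/3xx answer as "up".
import urllib.request


def is_up(url, timeout=10):
    """Return True if the server answers a HEAD request with a 2xx/3xx status."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except Exception:
        # Timeouts, connection refusals, and 4xx/5xx responses all count as down.
        return False
```

Note that a HEAD request only proves the server can answer for that URL; if the feed dies partway through generation, only a (cheaper, nearby) full-download test will catch it.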

肤浅与狂妄 2024-09-02 07:29:29

Found an online tool that does what I needed:
http://wasitup.com uses head requests so it doesn't time out waiting to download the whole 150MB file.
Thanks for the help BrianLy!

全部不再 2024-09-02 07:29:29

Looks like Pingdom does not support HEAD requests. I've put in a feature request, but who knows.

I hacked this capability into mon for now (mon is a nice compromise between paying someone else to monitor and doing everything yourself). I have switched entirely to HTTPS, so I modified the https monitor to do it. I did it the dead-simple way: copied the https.monitor file and called it https.head.monitor. In the new monitor file I changed the line that says (you might also want to update the function name and the place where it's called):

get_https to head_https

Now in mon.cf you can call a head request:

monitor https.head.monitor -u /path/to/file
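The hacked monitor above is a Perl script from the mon distribution; what it boils down to can be sketched as a standalone Python script. The host/path arguments, the `head_status` helper, and the exit-code convention (0 = up, non-zero = down, which is what mon expects from its monitors) are assumptions in this sketch, not the actual https.head.monitor code:

```python
# Hedged sketch of what the hacked https.head.monitor does: issue a
# HEAD request over HTTPS and exit 0 (up) or 1 (down).
import sys
import http.client


def head_status(host, path="/", port=443, timeout=10,
                conn_cls=http.client.HTTPSConnection):
    """Return the HTTP status of a HEAD request without fetching the body."""
    conn = conn_cls(host, port, timeout=timeout)
    try:
        conn.request("HEAD", path)
        return conn.getresponse().status
    finally:
        conn.close()


if __name__ == "__main__":
    host = sys.argv[1]
    path = sys.argv[2] if len(sys.argv) > 2 else "/"
    try:
        ok = head_status(host, path) == 200
    except OSError:
        # Connection refused, DNS failure, timeout: report down.
        ok = False
    sys.exit(0 if ok else 1)
```

The `conn_cls` parameter is only there so the same helper can be exercised over plain HTTP; in the monitor itself you would always use HTTPS.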