Does Firefox synchronize requests to the same page?
I'm using Firefox 3.6.6.
I have a PHP script called index.php with the following code:
<?php
sleep(20); die(time());
?>
I open two browser tabs, copy the URL into each of them, and then quickly hit enter in each tab. The first tab completes in just over 20 seconds. The second tab completes in just over 40 seconds.
I do the same experiment in IE and both scripts complete within a second of each other, around 20 seconds.
Is this expected behavior? The actual script that caused me to test this is a synchronous procedure. I want any person attempting to execute it twice to receive an error that the process is already in progress, rather than having the browser sit there and wait until it can execute it a second time.
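For what it's worth, the kind of guard described here can be sketched with a non-blocking file lock. This is only a sketch: the lock-file path, the 409 status, and the error message are illustrative, and it only helps if the browser actually sends the second request.
<?php
// Sketch of an "already in progress" guard using a non-blocking file lock.
// The lock-file path and the 409 response are illustrative.
$lockFile = sys_get_temp_dir() . '/my-sync-process.lock';
$fp = fopen($lockFile, 'c');

if ($fp === false || !flock($fp, LOCK_EX | LOCK_NB)) {
    // A previous request still holds the lock: report it instead of waiting.
    header('HTTP/1.1 409 Conflict');
    die('The process is already in progress.');
}

// ... the long-running synchronous work goes here ...
sleep(20);

flock($fp, LOCK_UN);
fclose($fp);
echo time();
?>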
If this is how Firefox works, how does it determine when a page is a duplicate and that it should queue up the requests rather than run them simultaneously?
I can fool it by putting a junk GET string on the end: with index.php and index.php?JUNK=1, both complete in around 20 seconds.
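A sketch of that workaround, assuming you control the link that triggers the script (the nocache parameter name is arbitrary):
<?php
// Give each link a unique query string so Firefox treats the requests
// as different resources instead of queueing the second one.
$url = 'index.php?nocache=' . uniqid();
echo '<a href="' . htmlspecialchars($url) . '">Run the process</a>';
?>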
Comments (1)
It's the caching mechanism: if you disable all caching, the problem does not occur. Apparently the Firefox developers check that a request which is still loading might be cacheable, so it waits for the first request to finish, only then decides not to use the cache, and only then dispatches the second request (you can verify this from the timestamps in your server log files).
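If the server logs aren't handy, you can also record the arrival time of each request from the script itself; a minimal sketch, with an illustrative log path:
<?php
// Log when each request actually reaches PHP, so you can tell whether the
// delay happens in the browser or on the server. The log path is illustrative.
error_log(date('H:i:s') . ' request received: ' . $_SERVER['REQUEST_URI'] . "\n", 3, '/tmp/request-timing.log');

sleep(20);
echo time();
?>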
You could try playing around with some no-cache headers and a flush() of data to the client; perhaps Firefox can determine that caching is not an option as soon as it receives the headers, although I suspect the cache check is only done after the content has loaded.
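A rough sketch of that idea; the exact headers and the amount of padding are guesses, and it may make no difference if the cache check only happens after the content is loaded:
<?php
// Send explicit no-cache headers and push some output to the client before
// the long-running work, hoping Firefox decides early that the response is
// not cacheable and dispatches the second request immediately.
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Pragma: no-cache');
header('Expires: 0');

echo str_pad(' ', 1024) . "\n"; // padding, since small chunks may be buffered
flush();                        // push headers and padding to the client now

sleep(20);                      // the long-running work
echo time();
?>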