Which is the best way to do pagination so that the load on the server is minimal?

Published 2024-09-27 13:02:40

I have done a bit of research on pagination, and from what I have read there are two contradictory approaches to doing it:

  1. Load a small set of data from the database each time the user clicks "next".
    Problem - Suppose there are a million rows that meet the WHERE conditions. That means a million rows are retrieved, stored, filesorted, and then most of them are discarded and only 20 are returned. If the user clicks the "next" button, the same process happens again, only a different 20 are returned (see the sketch after this list). (ref - http://www.mysqlperformanceblog.com/2008/09/24/four-ways-to-optimize-paginated-displays/)

  2. Load all the data from the database and cache it... This has a few problems too, mentioned here - http://www.javalobby.org/java/forums/t63849.html
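For illustration, option 1 usually boils down to one LIMIT/OFFSET query per click. The sketch below is a minimal example; the table and column names are made up, and you would hand the result to whatever MySQL client you use. The point is that the offset grows with the page number, so the server does more and more throwaway work on deep pages.

```typescript
// Sketch of option 1: one small query per "next" click, using LIMIT/OFFSET.
// Table and column names are illustrative only.

const PAGE_SIZE = 20;

function buildPageQuery(page: number): { sql: string; params: number[] } {
  const offset = page * PAGE_SIZE;
  return {
    // MySQL still has to locate and sort (offset + 20) matching rows and
    // discard the first `offset` of them -- the cost described above.
    sql: `SELECT id, title, created_at
            FROM articles
           WHERE status = 'published'
        ORDER BY created_at DESC
           LIMIT ? OFFSET ?`,
    params: [PAGE_SIZE, offset],
  };
}

// Page 0 touches ~20 rows; page 50,000 touches ~1,000,020 rows for the same 20 results.
console.log(buildPageQuery(0));
console.log(buildPageQuery(50_000));
```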

So I know I will have to use a hybrid of both... however, the question boils down to - which operation is more expensive:
making repeated queries against the database for small chunks of data,
or
transferring a large result set over the network?

Comments (1)

許願樹丅啲祈禱 2024-10-04 13:02:40

My company has exactly this situation, and we've chosen a bit of a hybrid. Our data is tabular, so we send it via AJAX to DataTables. This allows for good UI formatting, sorting, filtering, and show/hide of columns. DataTables has a great solution called "pipelining" that will "queue ahead": it grabs a quantity of data ahead of the user's action (in our case, up to 5 times the records they request), then pages through without further requests until it runs out of data. It's EXTREMELY easy to implement with DataTables, but I suspect a similar solution would not be difficult to write by hand using jQuery's AJAX functionality.
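To make the "pipelining" idea concrete, here is a minimal hand-rolled sketch of the kind of thing DataTables does for you, assuming a hypothetical /rows endpoint that returns a JSON array; fetch() stands in for jQuery's $.ajax:

```typescript
// Prefetch several pages per round trip and serve "next" clicks from a
// local buffer, only hitting the server again when the buffer runs dry.
// The /rows endpoint and its JSON shape are assumptions for illustration.

interface Row { id: number; [key: string]: unknown; }

const PAGE_SIZE = 20;
const PREFETCH_FACTOR = 5;   // fetch 5 pages' worth at a time, as described above

let buffer: Row[] = [];      // rows fetched from the server but not yet shown
let nextServerOffset = 0;    // where the next bulk request should start

async function refillBuffer(): Promise<void> {
  const limit = PAGE_SIZE * PREFETCH_FACTOR;
  const res = await fetch(`/rows?offset=${nextServerOffset}&limit=${limit}`);
  const rows: Row[] = await res.json();
  buffer = buffer.concat(rows);
  nextServerOffset += rows.length;
}

// Called on every "next" click: usually an in-memory slice, occasionally
// a real round trip to the server/database.
async function nextPage(): Promise<Row[]> {
  if (buffer.length < PAGE_SIZE) {
    await refillBuffer();
  }
  return buffer.splice(0, PAGE_SIZE);
}
```

The server still only sees small queries, so the database never sorts and ships the whole million-row result, while the user gets instant paging most of the time.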

I tried doing a full load and cache on a 1.5 million record database and it was a trainwreck. The client almost dumped me because they got mad it was so slow. After a solid overnight of AJAX goodness, the client was happy once again. But best never to get to that point.

Good Luck.
