JavaScript table sorting/paging (client-side): how big is too big?
I'm using a jQuery plugin called Tablesorter to do client-side sorting of a log table in one of my applications. I am also making use of the tablepager add-in.
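For reference, this is roughly how the two plugins are wired together; a minimal sketch, with #logTable and #pager as placeholder IDs since the question doesn't show the actual markup:

    // Minimal Tablesorter 2.0 + pager wiring. The #logTable and #pager
    // selectors are placeholders for this sketch.
    $(function () {
        $("#logTable")
            .tablesorter({ sortList: [[0, 1]] })  // pre-sort by first column, descending
            .tablesorterPager({
                container: $("#pager"),           // element holding the pager controls
                size: 20                          // rows shown per page
            });
    });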
I really like the responsiveness that client-side sorting and paging brings to the party. I also like how you don't have to hit the web server or database repeatedly.
However I can see that, in time, the log I'm displaying could grow quite large. I'm sure there comes a point where client-side paging and sorting becomes impractical. At what point will this technique begin to collapse under its own weight? 500 records? 2000 records? 10,000 records?
EDIT:
In a nutshell, what criteria would you use to decide between client-side sorting/paging and server-side paging? Does the size of the expected result set factor into your decision? Where is the tipping point?
3 Answers
This technique will probably collapse when the browser or client host can't take it.
Use server-side pagination to prevent this.
I would first consider the amount of data I am sending to the client, which in turn drives the loading time.

Say each row of the table is 200 bytes and I am sending 10,000 rows to the client (to allow client-side sorting and pagination). That is 200 * 10,000 = 2,000,000 bytes, i.e. 2 MB. The browser will take quite some time to load it from the server, then the sorting plugin needs time to sort everything, and the pager needs time to page it.

In fact, your server load will also increase, since it has to send ALL the rows to the client.

Normally, with so much data and iteration for JavaScript to handle, the browser (Firefox or similar) will lock up and look as though it has crashed.

If you use server-side sorting + pagination, the client sees accurate and up-to-date information. Say you have the same 10,000 rows at 200 bytes each, with 20 rows per page. You are only sending 20 * 200 = 4,000 bytes, i.e. 4 KB, which is relatively small and easily handled by the browser and server.
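As a sketch of what the client side of that could look like, assuming a hypothetical /logs endpoint that accepts page, size, and sortBy parameters and returns the 20-row slice as JSON (the endpoint, parameters, and field names are illustrative assumptions, not from the original answer):

    // Fetch one 20-row page from a hypothetical server-side endpoint.
    // /logs, its parameters, and row.timestamp/row.message are assumptions.
    function loadPage(page, sortBy) {
        $.getJSON("/logs", { page: page, size: 20, sortBy: sortBy }, function (rows) {
            var $tbody = $("#logTable tbody").empty();
            $.each(rows, function (i, row) {
                $("<tr>")
                    .append($("<td>").text(row.timestamp))
                    .append($("<td>").text(row.message))
                    .appendTo($tbody);
            });
        });
    }

    loadPage(1, "timestamp"); // ~4 KB per request instead of 2 MB up front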
This really depends on a lot of different things: table size, number of columns, and which browser and version the person is using. I can routinely sort up to 1000 records before seeing a real problem. If you start approaching that number, I would definitely start looking at server-side sorting. With AJAX, server-side sorting can be quite efficient and still give a decent user experience.

The best way to evaluate your particular situation is to try it out and see. Browsers, although not really designed to handle large amounts of data like that, can still do it. The user experience will be abysmal, but the number of records they can cope with is quite high.
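One rough way to "try it out and see": generate dummy rows and time the sort in the console. A sketch, assuming the classic Tablesorter "update"/"sorton" events and a placeholder #logTable:

    // Rough benchmark sketch: fill #logTable (a placeholder ID) with n dummy
    // rows, then time how long Tablesorter takes to sort them.
    var $table = $("#logTable").tablesorter();

    function benchmarkSort(n) {
        var html = [];
        for (var i = 0; i < n; i++) {
            html.push("<tr><td>row " + i + "</td><td>" + Math.random() + "</td></tr>");
        }
        $table.find("tbody").html(html.join(""));
        $table.trigger("update");                 // tell the plugin to re-parse the rows

        console.time("sort " + n + " rows");
        $table.trigger("sorton", [[[1, 0]]]);     // sort by the second column, ascending
        console.timeEnd("sort " + n + " rows");
    }

    benchmarkSort(1000); // try 500, 2000, 10000 to find your own tipping point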
A few hundred is probably okay, depending on the number of columns. This will most certainly break down when you're dealing with data on the order of 10^3 (thousands).
These have been my empirical findings across different browsers, but I was usually on beefy hardware. I would limit your data set to hundreds.