What are some best practices for client-server interaction?
I'm building a website for work, and one of the more important features is a rich content grid displaying data. By default, it only displays 20 items per page, but we have ~200 items in the database that can be filtered, sorted, and searched.
Our sales and marketing team has also requested a "list all" feature so they can display all of the data in one place and scroll through rather than page through the data.
This entire system is built using ASP.Net MVC on the server side, jQuery and Flexigrid on the client side, and uses JSON to exchange data between the two via AJAX.
I've gotten the actual data transfer part pretty solid. A page of 20 results takes 800ms for the entire request (POST a request to the server via Flexigrid and get the response). It's more the client-side processing that takes a while.
I could offload some of the client-side processing to the server. But this would make the server-side operation take longer and make the size of the document returned that much larger. Not a problem in situations with a high-speed Internet connection ... but that's not necessarily the case.
The other option I have is to download as much data as possible and shift the majority of the data processing to the client. This cuts the request time down to basically nil (only fetching changed elements rather than the entire data set). It will work pretty well on machines with fast CPUs and a lot of RAM, but that's not necessarily the case, either.
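The "only fetching changed elements" idea above can be sketched as a client-side row cache that merges server deltas. This is an illustration only; the `id`/`version` field names are assumptions, not Flexigrid's actual wire format.

```javascript
// Keep a client-side cache keyed by row id; a refresh merges only the
// rows the server reports as changed, instead of re-downloading everything.
var rowCache = {};

function mergeDelta(delta) {
  // delta: array of rows the server says changed since our last request
  delta.forEach(function (row) {
    rowCache[row.id] = row;
  });
}

function cachedRows() {
  return Object.keys(rowCache).map(function (id) { return rowCache[id]; });
}

// First full load
mergeDelta([{ id: 1, name: 'Widget', version: 1 },
            { id: 2, name: 'Gadget', version: 1 }]);
// Later, only row 2 changed on the server
mergeDelta([{ id: 2, name: 'Gadget v2', version: 2 }]);
```

The trade-off is exactly the one described: the client now holds and processes the whole set, which is cheap on the wire but expensive on slow machines.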
Since at least one person flagged this as "not a real question," let me clarify ...
- What can I possibly do to alleviate the client-side processing time issues without moving so much processing to the server that I end up with a data transfer time issue?
- What are some best practices when it comes to balancing client-side processing with server-side processing?
- Is it better to err on the side of the server or the client?
- What are some tools I can use to better optimize these exchanges so that things don't continue to go awry?
What are you processing on the client side that is taking so long? Processing a JSON object (even a very large one) should not be too intensive.
A lot of DOM lookups when writing your data client side could slow things down. Reducing DOM lookups can greatly help performance. I believe good practice for balancing server- and client-side processing is to err on the server. Since the server is under your control, you can always choose to upgrade it. Keeping the majority of processing on the server will also make things easier for mobile devices and older computers.
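The DOM-lookup advice above can be sketched as follows: build the table body as one string and insert it with a single DOM operation, instead of appending cell by cell. `buildRowsHtml` is a hypothetical helper, not part of Flexigrid.

```javascript
// Escape cell values so raw data can't inject markup.
function escapeHtml(s) {
  return String(s).replace(/&/g, '&amp;').replace(/</g, '&lt;')
                  .replace(/>/g, '&gt;');
}

// Build the entire <tbody> contents as a string: zero DOM touches here.
function buildRowsHtml(rows) {
  var html = [];
  rows.forEach(function (row) {
    html.push('<tr>');
    row.forEach(function (cell) {
      html.push('<td>' + escapeHtml(cell) + '</td>');
    });
    html.push('</tr>');
  });
  return html.join('');
}

// With jQuery, one single DOM write instead of thousands:
//   $('#grid tbody').html(buildRowsHtml(rows));
```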
You should utilize AJAX and client-side capabilities in a way that enhances the user experience. Load and process data as the users request it. By loading only what they request, you can decrease the load on both your server and your client.
If you are also requesting the same sort of data over and over, you can look into both server- and client-side caching. By utilizing caching you can reduce request times and/or bandwidth.
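On the client side, the caching suggestion above can be sketched as a small response cache with a time-to-live, so a repeated page/sort/filter combination skips the network. The key format and TTL are assumptions for illustration.

```javascript
// A response cache with a TTL in milliseconds. Timestamps are passed in
// explicitly so the expiry logic is easy to test deterministically.
function ResponseCache(ttlMs) {
  this.ttlMs = ttlMs;
  this.entries = {};
}

ResponseCache.prototype.get = function (key, now) {
  var e = this.entries[key];
  if (e && (now - e.at) < this.ttlMs) return e.value;
  return null; // expired or missing: caller should hit the server
};

ResponseCache.prototype.set = function (key, value, now) {
  this.entries[key] = { value: value, at: now };
};

// Usage: key the cache on the request parameters (an assumed format).
var cache = new ResponseCache(30000); // 30-second TTL
cache.set('page=1&sortname=name', { rows: [] }, Date.now());
```

Server-side, ASP.NET MVC's output caching can play the same role for identical grid requests.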
As it turns out, the problem is more with the JavaScript engines on the client side than the data I'm working with. I've spent much of the day benchmarking the process and timing various operations.
Everything runs quickly in Chrome. Everything runs pretty fast (though not as fast as Chrome) in Firefox. The real performance laggard is Internet Explorer.
When I load the entire data set - all 200 rows - Flexigrid attempts to do some post-processing on every cell in the table. As you can see in the screenshot, each row has 29 cells ... so we're running those formatting operations a total of 5,800 times.
I was able to pull some of the more expensive operations (i.e. creating jQuery objects) out of the lower-level cell loop so they're only run once per row, but ultimately I'm still running into IE-related performance issues.
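The hoisting described above can be sketched like this: anything identical for every cell (a compiled regex, a formatter lookup, a wrapper object) is created once, outside the cell loop. `currencyRe` and `formatRow` are illustrative names, not Flexigrid internals.

```javascript
// Hoisted: compiled once for the whole grid, not once per cell
// (which would have been 5,800 times for the full data set).
var currencyRe = /^\$?\d+(\.\d{2})?$/;

function formatRow(cells) {
  // The inner loop now only does the per-cell work that truly varies.
  return cells.map(function (cell) {
    return currencyRe.test(cell) ? 'money' : 'text';
  });
}
```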
To give you some real-world benchmarks, I set the code to spit out the total time before it hits certain events:
- `populate` fires when the browser first sends off the XHR request
- `addData` fires after the request has returned, before the JSON object is parsed
- `addCellProp` fires after the initial parsing of the data, iterating through each cell in the table
- `done` fires when everything is finished

Processing 20 rows of data (default):
Processing the full data set (179 rows on this machine):
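The instrumentation behind these numbers can be reproduced with a small timer that records elapsed milliseconds at each named event. The event names match the ones listed above; everything else is an assumption.

```javascript
// Record a timestamp at each named event, relative to construction time.
function Timer() {
  this.start = Date.now();
  this.marks = {};
}

Timer.prototype.mark = function (name) {
  this.marks[name] = Date.now() - this.start;
};

// Usage: mark each grid event as it fires, then inspect t.marks.
var t = new Timer();
t.mark('populate');
// ... XHR returns, JSON is parsed, cells are processed ...
t.mark('addData');
t.mark('addCellProp');
t.mark('done');
```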
The most expensive operation is between `addCellProp` and `done`. I've gone through and made the code as efficient as possible, but there's only so much you can do when you're running through that many iterations of a data set, particularly when manipulating the DOM.

I've modified Flexigrid (despite many recommendations not to) to touch the DOM as little as possible, and that has actually sped things up quite a bit. When I started this research, IE9 would take between 20 and 30 seconds to hit the `done` event.

The unfortunate truth here is that not all platforms are created equal, and IE doesn't seem to be the best engine for working with data within the display in this fashion.
A better approach might be to create and manipulate the HTML table on the server side and send the entire thing (markup and all) to the browser when requested for IE users rather than depending on IE to create the markup from a raw JSON object.
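On the client, that fallback could amount to choosing the response format by browser, asking the server for pre-rendered HTML on IE and raw JSON elsewhere. The `format` parameter and `/grid` route are assumptions about the ASP.NET MVC side, not existing endpoints.

```javascript
// Decide whether to request server-rendered markup or raw JSON.
// MSIE covers IE <= 10; Trident covers IE 11.
function gridRequestFormat(userAgent) {
  var isIE = /MSIE|Trident/.test(userAgent);
  return isIE ? 'html' : 'json';
}

// Usage with jQuery (illustrative; renderFromJson is hypothetical):
//   var fmt = gridRequestFormat(navigator.userAgent);
//   $.get('/grid', { format: fmt }, function (resp) {
//     if (fmt === 'html') $('#grid tbody').html(resp);
//     else renderFromJson(resp);
//   });
```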
What can I possibly do to alleviate the client-side processing time issues without moving so much processing to the server that I end up with a data transfer time issue?
Is the data coming out of a database? If so, restrict the data there. That's what the db is good at. I use Flexigrid and keep all the paging, sorting, and filtering there. The db only returns the required data, sorted and filtered as requested. All the server has to do is return it, and all the client has to do is render it.
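The restriction described above is a sort-then-skip/take over the filtered set. Here it is sketched in JavaScript over an in-memory array; in SQL this would be a WHERE / ORDER BY with OFFSET-FETCH. The `page` and `rp` (rows per page) names follow the parameters Flexigrid posts; the rest is illustrative.

```javascript
// Return one page of sorted data plus the total count the grid
// needs for its pager.
function pageOf(items, page, rp, sortKey) {
  var sorted = items.slice().sort(function (a, b) {
    return a[sortKey] < b[sortKey] ? -1 : a[sortKey] > b[sortKey] ? 1 : 0;
  });
  var offset = (page - 1) * rp;
  return { total: items.length, rows: sorted.slice(offset, offset + rp) };
}
```

Done in the database, this keeps the response small no matter how large the table grows.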
What are some best practices when it comes to balancing client-side processing with server-side processing?
Keep the client side as light as possible.
Is it better to err on the side of the server or the client?
Err on the side of the server; it has much more power.
What are some tools I can use to better optimize these exchanges so that things don't continue to go awry?
Use the Network tab in the IE developer tools to see what's coming over the wire.