Huge grid displayed on an ASP.NET page

Posted on 2024-09-15 18:54:41

I have a major problem. We have an ASP.NET application with a report that shows about 1,000 rows right now and could potentially grow to 20,000. Don't ask me why, but our client does not like paging and does not like filtering; they want to see everything on a single page. Our obvious problem is the memory load this puts on the server (and the risk that the client's browser may crash as well).

My question is: if I provide a custom desktop application only for this report, one that can display thousands and thousands of rows (through web services or remoting), would it clog up the server? With the ASP.NET application, the IIS worker process basically eats up the memory on the server; but if this desktop app runs separately, calling the same database on the application server, would that solve the memory problem?

Comments (4)

江南烟雨〆相思醉 2024-09-22 18:54:41

Try using a lazy-loading grid such as the jqGrid: Look at the third link [virtual scrolling] on this page:

http://www.trirand.net/demoaspnet.aspx

The grid uses AJAX to load only the data that is visible on the page for the current scroll position. Not a pager control in sight. Nice if you have to keep this as an ASP.NET page.

Otherwise, @jmein's suggestion to make it a download is a good one. Just stream the report to the Response stream, using an appropriately sized buffer.
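For instance, a minimal sketch of that download approach (the file path, name, and 64 KB buffer size below are assumptions for illustration, not from the original answer): the report is copied to the response in fixed-size chunks, so the whole file never sits in memory at once.

// Code-behind of an .aspx page; assumes the report has already been
// written to a file on disk (the path and file name are hypothetical).
Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment; filename=report.csv");
Response.BufferOutput = false; // push bytes to the client as they are written

using (var file = System.IO.File.OpenRead(Server.MapPath("~/App_Data/report.csv")))
{
    var buffer = new byte[64 * 1024]; // the "appropriately sized buffer"
    int bytesRead;
    while ((bytesRead = file.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, bytesRead);
    }
}
Response.End();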

Also, read up on the use of IEnumerable<T> and the yield return statement to minimize the amount of data that you are loading into memory for streaming in the response.
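A minimal sketch of that idea (the table, columns, and helper name are made up for illustration): the iterator yields one row at a time, so only the row currently being written is materialized.

// Requires using System.Collections.Generic; and using System.Data.SqlClient;
// connectionString is assumed to come from configuration.
private static IEnumerable<string> GetReportLines(string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT Col1, Col2 FROM Report", connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // Hand back one row; nothing else from the result set is kept around.
                yield return reader[0] + "," + reader[1];
            }
        }
    }
}

// Consumer: write and flush each line before pulling the next one.
foreach (string line in GetReportLines(connectionString))
{
    Response.Write(line + "\r\n");
    Response.Flush();
}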

空袭的梦i 2024-09-22 18:54:41

Could you create an Excel file out of the data so that you don't have to worry about it?
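One way this could look (assuming a plain CSV attachment that Excel opens is acceptable, and reusing the dataAccessLayer.GetData() call from the answer below): stream the rows out as a CSV download instead of rendering 20,000 rows of HTML.

Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment; filename=report.csv");

using (IDataReader reader = dataAccessLayer.GetData())
{
    while (reader.Read())
    {
        var fields = new string[reader.FieldCount];
        for (int i = 0; i < reader.FieldCount; i++)
        {
            // Escape embedded quotes so the CSV stays well-formed.
            fields[i] = Convert.ToString(reader[i]).Replace("\"", "\"\"");
        }
        Response.Write("\"" + string.Join("\",\"", fields) + "\"\r\n");
    }
}
Response.End();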

过气美图社 2024-09-22 18:54:41

The delivery mechanism doesn't matter - you can do this in ASP.NET or a desktop application without consuming a ridiculous amount of memory.

The general principle is that you need to access the data as a stream instead of loading it all into memory at once. When you handle streamed data, you only deal with a subset of your results at any given time. When you move to the next record in a stream, you're signaling that you're finished with the previous record, so the .NET runtime can reclaim the memory used to manipulate it.

In C# this means using a DataReader (normally obtained via IDbCommand.ExecuteReader). A typical fragment that writes directly to the HttpResponse stream using a data reader might look like this (though you can databind to them as well):

using(IDataReader reader = dataAccessLayer.GetData()) {

    if (! reader.IsClosed) {

        // Send writes to the client immediately
        // reader.Read advances the reader to the next record in the 
        // result set and discards the current record
        while (reader.Read()) {

            // Do something with the record - this just writes the first 
            // column to the response stream.
            Response.Write(reader[0]);

            // Send the content to the client immediately, even if the content
            // is buffered. The only data in memory at any given time is the
            // row you're working on.
            Response.Flush();
        }
    }
}
你的往事 2024-09-22 18:54:41

If you set up a web service to return 20K records and each one is 1K, then that's a 20MB service call. I agree with Daniel that AJAX lazy-loading is in order here so that you fetch smaller chunks at a time.
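As a rough illustration of fetching smaller chunks (the method signature, ReportRow type, and SQL are assumptions; OFFSET/FETCH needs SQL Server 2012 or later), the service could return one page of rows per call and let the grid request the next chunk on demand:

// Requires using System.Collections.Generic; using System.Data.SqlClient;
// and using System.Web.Services; connectionString is assumed to come from configuration.
public class ReportRow
{
    public string Col1 { get; set; }
    public string Col2 { get; set; }
}

[WebMethod]
public List<ReportRow> GetReportPage(int pageIndex, int pageSize)
{
    var rows = new List<ReportRow>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT Col1, Col2 FROM Report ORDER BY Col1 " +
        "OFFSET @skip ROWS FETCH NEXT @take ROWS ONLY", connection))
    {
        command.Parameters.AddWithValue("@skip", pageIndex * pageSize);
        command.Parameters.AddWithValue("@take", pageSize);
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                rows.Add(new ReportRow
                {
                    Col1 = reader.GetString(0),
                    Col2 = reader.GetString(1)
                });
            }
        }
    }
    return rows;
}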
