Caching search results in the session and keeping the Large Object Heap clean

Posted 2024-11-08 18:52:08


OK, so I've been working on an ASP.NET project for a while, and it seems I've made some bad design choices that are coming back to haunt me as the project keeps getting bigger and bigger in terms of the data it holds.

After reading up on .NET memory management, I think I've identified a whole set of potential causes. Since what I'm doing isn't particularly special, I'm wondering if there's a standard pattern for it that I'm missing.

So I have a (somewhat expensive) query which yields anywhere between 1 and 20,000 results. On subsequent requests we may just be paging through the result set, so I store the result in the session. Session state is InProc. I'm wondering:

  • Does it make sense (a) to store the result (b) in the session (c) in-process? I want the speed of (a). I don't know of a more efficient scope than storing it per user (b), and if I move to a more sophisticated state server, doesn't that rather get slower (c)? Or could that be the solution, getting rid of those large objects sooner instead of keeping the last result set in RAM until the session expires?

  • If any result set of more than roughly 20,000 rows can end up messing up the LOH, is there a generic way to get around that?

I know this question is slightly underspecified. I just realized my overall design might be flawed (w.r.t. scalability), and I'm trying to estimate exactly how flawed it is. I hope that some hints about standard patterns can be collected that turn this into a generally useful question nevertheless.
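For context on the LOH concern: the CLR allocates objects of roughly 85,000 bytes or more on the Large Object Heap, which is only collected with gen 2 and is prone to fragmentation. Below is a minimal sketch of the pattern described above; the names (SearchResultRow, RunExpensiveQuery, LastResults) are hypothetical stand-ins, not the poster's actual code.

```csharp
using System;
using System.Collections.Generic;
using System.Web.UI;

// Hypothetical row type standing in for one row of the expensive query.
public class SearchResultRow
{
    public int Id;
    public string Name;
}

public partial class SearchPage : Page
{
    protected void RunSearch(string criteria)
    {
        // Up to ~20,000 rows. Each row object is small, but the List<T>'s
        // backing array of 20,000 object references is 20,000 * 8 bytes
        // = 160,000 bytes on a 64-bit process, above the ~85,000-byte
        // threshold, so the array itself is allocated on the LOH.
        List<SearchResultRow> results = RunExpensiveQuery(criteria);

        // InProc session: the whole result set stays in the worker
        // process's RAM until the session times out or the pool recycles.
        Session["LastResults"] = results;
    }

    private List<SearchResultRow> RunExpensiveQuery(string criteria)
    {
        // Placeholder for the somewhat expensive query.
        return new List<SearchResultRow>();
    }
}
```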


Comments (3)

冷默言语 2024-11-15 18:52:09


Why always return all the records? I think the best way to speed up your query is to return only the data the user needs, so only the rows that fit on the current page!

Try googling for ROW_NUMBER() (SQL Server) or LIMIT (MySQL).

Here are two good tutorials:

1) ScottGu's Blog

2) 15 Second Tutorial
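To make the suggestion concrete, here is a minimal sketch of ROW_NUMBER()-based paging on SQL Server, called from C#. The table and column names (Products, ProductId, Name) and the connection string are assumptions for illustration, not the poster's schema.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public static class ProductRepository
{
    // Returns only the rows for one page, using ROW_NUMBER() (SQL Server 2005+).
    public static List<string> GetPage(string connectionString, int pageIndex, int pageSize)
    {
        const string sql = @"
            SELECT Name FROM (
                SELECT Name, ROW_NUMBER() OVER (ORDER BY ProductId) AS RowNum
                FROM Products
            ) AS Numbered
            WHERE RowNum BETWEEN @First AND @Last;";

        var page = new List<string>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            // pageIndex is zero-based: page 0 covers rows 1..pageSize, and so on.
            command.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
            command.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    page.Add(reader.GetString(0));
            }
        }
        return page;
    }
}
```

On MySQL the same effect is achieved with a LIMIT offset, count clause instead of the ROW_NUMBER() wrapper.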

辞取 2024-11-15 18:52:09


Not knowing what your query is, but why would you pull more rows from your database than you need to show your user at one time? With good indexes, pulling up subsequent pages should be pretty quick, and you only need to do that if those pages are actually requested.

An alternative is to save just the IDs of the result set's up to 20,000 items. That way, if you need to page through them, you can quickly pull up the individual rows for the current page via the primary key.

Finally, maybe you should consider using the Cache object to store your results rather than the Session. That way you let .NET decide when to dispose of the objects, and they don't result in a bloated session.
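A minimal sketch of combining those two ideas, keeping only the primary keys in the ASP.NET Cache with a sliding expiration so the runtime can evict them; the key format, the ten-minute window, and int primary keys are assumptions for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class SearchResultCache
{
    // Store only the primary keys of the result set, not the full rows.
    public static void StoreIds(string userKey, List<int> resultIds)
    {
        HttpRuntime.Cache.Insert(
            "search-ids:" + userKey,        // hypothetical key format
            resultIds,
            null,                           // no cache dependency
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(10));      // sliding expiration; .NET may also evict earlier under memory pressure
    }

    public static List<int> GetIds(string userKey)
    {
        return HttpRuntime.Cache["search-ids:" + userKey] as List<int>;
    }
}
```

A page request would then take the slice of IDs for the requested page and fetch just those rows by primary key.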

今天小雨转甜 2024-11-15 18:52:09


You should try to avoid storing the results in the session. Your application likely won't work well if the user employs multiple browser tabs in the same session (it happens).

If you do use the session, definitely don't use InProc mode, because as the number of users grows the worker process will eat up memory and eventually recycle, and the users' sessions will be lost even if the timeout hasn't elapsed.

Try to page with the database and, as Keltex mentioned, only pull the data that you're displaying.
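One sketch of how that can look with no session state at all: the page index travels on the query string, so each browser tab carries its own state, and each request pulls only its own page from the database. ProductRepository.GetPage refers to the hypothetical ROW_NUMBER() helper sketched under the first answer; the page and control names here are likewise assumptions.

```csharp
using System;
using System.Web.UI;

public partial class ResultsPage : Page
{
    private const int PageSize = 50;

    protected void Page_Load(object sender, EventArgs e)
    {
        // Paging state lives on the URL (?page=n), not in the session,
        // so two tabs with different searches cannot clobber each other.
        int pageIndex;
        if (!int.TryParse(Request.QueryString["page"], out pageIndex) || pageIndex < 0)
            pageIndex = 0;

        // Fetch only this page from the database, e.g.:
        // var rows = ProductRepository.GetPage(connectionString, pageIndex, PageSize);
        // resultsGrid.DataSource = rows;
        // resultsGrid.DataBind();
    }
}
```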
