SQL Server performance with a large number of tables in the database

Posted 2024-08-12 01:06:03


I am updating a piece of legacy code in one of our web apps. The app allows the user to upload a spreadsheet, which we will process as a background job.

Each of these user uploads creates a new table to store the spreadsheet data, so the number of tables in my SQL Server 2000 database will grow quickly - thousands of tables in the near term. I'm worried that this might not be something that SQL Server is optimized for.

It would be easiest to leave this mechanism as-is, but I don't want to leave a time-bomb that is going to blow up later. Better to fix it now if it needs fixing (the obvious alternative is one large table with a key associating records with user batches).

Is this architecture likely to create a performance problem as the number of tables grows? And if so, could the problem be mitigated by upgrading to a later version of SQL Server?

Edit: Some more information in response to questions:

  • Each of these tables has the same schema. There is no reason that it couldn't have been implemented as one large table; it just wasn't (see the sketch after this list).
  • Deleting old tables is also an option. They might be needed for a month or two, no longer than that.
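
For reference, a minimal sketch of the one-large-table alternative described above - the table and column names here are hypothetical, and the data columns are stand-ins for whatever the shared spreadsheet schema actually contains:

    -- Hypothetical single-table layout: one row per spreadsheet record,
    -- keyed by the upload batch it came from.
    CREATE TABLE SpreadsheetRow (
        UploadBatchId int          NOT NULL,  -- which user upload the row belongs to
        RowNumber     int          NOT NULL,  -- position within the uploaded spreadsheet
        Col1          varchar(255) NULL,      -- stand-ins for the shared spreadsheet columns
        Col2          varchar(255) NULL,
        Col3          varchar(255) NULL,
        CONSTRAINT PK_SpreadsheetRow PRIMARY KEY CLUSTERED (UploadBatchId, RowNumber)
    );

Clustering on (UploadBatchId, RowNumber) keeps each batch's rows physically together, so reading or purging a single batch stays cheap as the table grows.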


Comments (4)

独留℉清风醉 2024-08-19 01:06:03


Having many tables is not an issue for the engine; the catalog metadata is optimized for very large sizes. There are also some advantages to having each user own their table, like the ability to have separate security ACLs per table, separate table statistics for each user's content, and, not least, better query performance when an 'accidental' table scan happens.

What is a problem, though, is maintenance. If you keep this design, you absolutely must set up an automated maintenance task; you cannot leave it as a manual job for your admins.
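
As a rough illustration only - assuming, hypothetically, that the per-upload tables share a naming prefix such as Upload_ and can be dropped after about 60 days - an automated cleanup task on SQL Server 2000 might look something like this:

    -- Hypothetical cleanup job: drop per-upload tables older than 60 days.
    -- Uses sysobjects/crdate since the question concerns SQL Server 2000.
    DECLARE @name sysname, @sql nvarchar(4000);

    DECLARE stale_tables CURSOR FOR
        SELECT name
        FROM   sysobjects
        WHERE  xtype = 'U'                              -- user tables only
          AND  name LIKE 'Upload[_]%'                   -- assumed naming convention
          AND  crdate < DATEADD(day, -60, GETDATE());   -- created more than 60 days ago

    OPEN stale_tables;
    FETCH NEXT FROM stale_tables INTO @name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'DROP TABLE ' + QUOTENAME(@name);
        EXEC (@sql);
        FETCH NEXT FROM stale_tables INTO @name;
    END
    CLOSE stale_tables;
    DEALLOCATE stale_tables;

Scheduled through SQL Server Agent, something along these lines keeps the table count from growing without bound; the prefix and retention period are assumptions, not part of the original question.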

携余温的黄昏 2024-08-19 01:06:03


I think this is definitely a problem that will be a pain later. Why would you need to create a new table every time? Unless there is a really good reason to do so, I would not do it.

The best way would be to simply create an ID and associate all the uploaded data with it, all in the same table. This will require some work on your part, but it's much safer and more manageable to boot.
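
In that scheme, both the background processing and later cleanup become simple keyed operations. A short sketch, reusing the hypothetical SpreadsheetRow table and UploadBatchId column sketched under the question:

    -- Hypothetical usage: the background job touches only its own batch.
    DECLARE @UploadBatchId int;
    SET @UploadBatchId = 42;   -- ID assigned when the spreadsheet was uploaded

    -- Process one upload's rows.
    SELECT RowNumber, Col1, Col2, Col3
    FROM   SpreadsheetRow
    WHERE  UploadBatchId = @UploadBatchId;

    -- Remove the batch once it is no longer needed (no DROP TABLE required).
    DELETE FROM SpreadsheetRow
    WHERE  UploadBatchId = @UploadBatchId;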

晚风撩人 2024-08-19 01:06:03


Having all of these tables isn't ideal for any database. After the upload, does the web app use the newly created table? Maybe it gives some feedback to the user on what was uploaded?

Does your application use all of these tables for any reporting, etc.? You mentioned keeping them around for a few months - not sure why. If not, move the contents to a central table and drop the individual tables.

Once the backend is taken care of, recode the website to save uploads to a central table. You may need two tables: an UploadHeader table to track the upload batch (who uploaded, when, etc.) linked to a detail table holding the individual records from the Excel upload.
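
A minimal sketch of that header/detail pair; the names and data columns are hypothetical:

    -- Hypothetical header/detail design for the central-table approach.
    CREATE TABLE UploadHeader (
        UploadId    int          IDENTITY(1,1) NOT NULL PRIMARY KEY,
        UploadedBy  varchar(100) NOT NULL,               -- who uploaded the spreadsheet
        UploadedAt  datetime     NOT NULL DEFAULT GETDATE(),
        FileName    varchar(260) NULL                    -- original spreadsheet file name
    );

    CREATE TABLE UploadDetail (
        UploadId    int          NOT NULL REFERENCES UploadHeader (UploadId),
        RowNumber   int          NOT NULL,               -- position within the spreadsheet
        Col1        varchar(255) NULL,                   -- stand-ins for the spreadsheet columns
        Col2        varchar(255) NULL,
        CONSTRAINT PK_UploadDetail PRIMARY KEY (UploadId, RowNumber)
    );

The foreign key ties every detail row back to its batch, so "who uploaded this and when" lives in exactly one place.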

不必了 2024-08-19 01:06:03


I suggest you store this data in a single table. On the server side you can create a console from which a user/operator could manually start the task of freeing up table entries. You can ask them for the range of dates whose data is no longer needed, and those rows will be deleted from the database.

You can take it a step further and set up a scheduled database job to wipe the entries/records after a specified time period, and again add a UI from which the user/operator/admin could set this data-validity limit.

Thus you could build the system so that junk data is deleted automatically after a period the admin can configure, while also giving them a console from which they can manually delete any other unwanted data.
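
A sketch of the kind of date-range cleanup that console could run; it reuses the hypothetical UploadHeader/UploadDetail tables from the previous answer, and the date literals are just placeholders:

    -- Hypothetical operator-driven cleanup: remove all batches uploaded in a date range.
    DECLARE @From datetime, @To datetime;
    SET @From = '20240501';   -- range supplied by the operator from the console
    SET @To   = '20240701';

    -- Delete detail rows for batches in the range...
    DELETE d
    FROM   UploadDetail AS d
           INNER JOIN UploadHeader AS h ON h.UploadId = d.UploadId
    WHERE  h.UploadedAt >= @From AND h.UploadedAt < @To;

    -- ...then the header rows themselves.
    DELETE FROM UploadHeader
    WHERE  UploadedAt >= @From AND UploadedAt < @To;

Wrapping both deletes in a transaction would keep header and detail consistent if the job is interrupted.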
