What are the implications of out-of-date table statistics on a Sybase/SQL Server database?

Posted 2024-07-14 08:03:55


For example, for heavily used tables with volumes in the order of 10 million rows that grow by a million rows a month, if the stats are 6-8 months old how detrimental to the performance of the database is this going to be? How often should you be refreshing the stats?


Comments (3)

檐上三寸雪 2024-07-21 08:03:56


Statistics are kept and used by the query planner, and they have a noticeable impact. I can't give you exact guidelines on how often you should refresh them. That will depend on how much work it takes to refresh them and how much impact fresh stats have on your queries. The real answer for this is to take good measurements and judge options by the results. Tinkering without measurement is a throw of the dice.
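The role statistics play in plan choice can be seen in miniature with a toy example. This sketch uses SQLite rather than Sybase (so it is runnable anywhere): SQLite's ANALYZE command fills the sqlite_stat1 table with the per-index row counts its planner consults, the analogue of the histograms Sybase keeps. Table and index names are made up for illustration:

```python
import sqlite3

# Toy illustration (SQLite, not Sybase): ANALYZE populates sqlite_stat1,
# the table of per-index row counts the query planner consults.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
cur.executemany(
    "INSERT INTO orders (status) VALUES (?)",
    [("open" if i % 1000 == 0 else "closed",) for i in range(10000)],
)
cur.execute("CREATE INDEX idx_status ON orders (status)")
cur.execute("ANALYZE")  # refresh the stats; the analogue of 'update statistics'

# stat's first number is total rows; the second is the average number of
# rows per distinct status value, which drives the planner's selectivity math.
row = cur.execute(
    "SELECT tbl, idx, stat FROM sqlite_stat1 WHERE idx = 'idx_status'"
).fetchone()
print(row)
```

Until ANALYZE runs, sqlite_stat1 does not even exist, so the planner falls back on fixed guesses; stale statistics in a big server database degrade plans the same way, just more gradually.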

趁微风不噪 2024-07-21 08:03:56


We refresh stats every night. No sense waiting for the weekend if the stats could be refreshed nightly - by Friday they will be worse than they were on Monday ...

Problem is what if it takes too long?

For databases which have that problem we refresh stats on certain tables each night - so some tables are done every night, some less often. (We have a database table of which tables to do when, and a history of how long the stats took to regenerate, and tune the schedule accordingly.)
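The per-table schedule the answer describes can be sketched as a small lookup keyed by table name. Everything here (table names, intervals, dates) is hypothetical; in practice the schedule and run history would live in a database table as the answer says:

```python
from datetime import datetime, timedelta

# Hypothetical schedule: table name -> refresh interval and last refresh time,
# mirroring the "database table of which tables to do when" idea.
schedule = {
    "orders":  {"interval_days": 1,  "last_run": datetime(2024, 7, 1)},
    "archive": {"interval_days": 30, "last_run": datetime(2024, 7, 10)},
}

def tables_due(schedule, now):
    """Return the tables whose stats are older than their refresh interval."""
    return [
        name
        for name, entry in schedule.items()
        if now - entry["last_run"] >= timedelta(days=entry["interval_days"])
    ]

# 'orders' is overdue (13 days since refresh, 1-day interval);
# 'archive' is not (4 days since refresh, 30-day interval).
print(tables_due(schedule, datetime(2024, 7, 14)))
```

Recording how long each refresh took, as the answer suggests, lets you move slow tables to a longer interval instead of guessing.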

"if the stats are 6-8 months old how detrimental to the performance of the database is this going to be"

I would be very surprised if it didn't make a huge difference on a table growing by 1 million rows per month.

If that is your actual state, I would expect that the tables need defragging too.

风吹雨成花 2024-07-21 08:03:56


Implications are dire. You should be refreshing them as often as you can to give the optimizer the best information to make decisions. You will be able to find out how bad the statistics are by running the optdiag utility. Analysing the output and running again to compare over a few days or a week will let you know exactly how bad the situation is. I would recommend that at the earliest convenience you drop and recreate the indexes and run 'update index statistics' on the table in question. This should be enough information to get you through. I am assuming that you are able to analyse the output of optdiag though.
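For reference, the Sybase ASE commands this answer mentions take roughly the following shape. The database, table, and index names below are placeholders, and exact options vary by ASE version, so check the documentation for your release:

```sql
-- Dump current statistics to a file with optdiag (run from the shell):
--   optdiag statistics mydb..orders -Usa -o stats_before.txt
-- Repeat a week later into stats_after.txt and diff the two files.

-- Refresh statistics for all columns of all indexes on the table:
update index statistics orders

-- Or target a single index:
update index statistics orders idx_status
```

Dropping and recreating an index also rebuilds its statistics as a side effect, which is why the answer pairs the two suggestions.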
