SQL Server 2000: Temp Tables vs. Table Variables
What would be more efficient for storing some temp data (50k rows in one table and 50k in the other) to perform some calculation? I'll be doing this process once, nightly.

How do you check the efficiency when comparing something like this?
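One common way to compare the two approaches is to run the same workload against a `#temp` table and a `@table` variable while capturing timing and I/O statistics. The sketch below is illustrative only; the table and column names (`#Work`, `@Work`, `dbo.SourceTable`, `Amount`) are assumptions, not from the question:

```sql
-- Hedged sketch: time the same aggregation against both storage options.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- Approach 1: local temp table (lives in tempdb; can be indexed, has statistics).
CREATE TABLE #Work (Id INT NOT NULL PRIMARY KEY, Amount MONEY);
INSERT INTO #Work (Id, Amount)
SELECT Id, Amount FROM dbo.SourceTable;  -- hypothetical ~50k-row source
SELECT SUM(Amount) FROM #Work;
DROP TABLE #Work;

-- Approach 2: table variable (no statistics; the optimizer assumes few rows).
DECLARE @Work TABLE (Id INT NOT NULL PRIMARY KEY, Amount MONEY);
INSERT INTO @Work (Id, Amount)
SELECT Id, Amount FROM dbo.SourceTable;
SELECT SUM(Amount) FROM @Work;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

Comparing the CPU time, elapsed time, and logical-reads output from the two halves (and the estimated execution plans) gives a concrete, repeatable basis for the comparison on your own data.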
The results will vary depending on which will be easier for storing the data: on disk (#temp) or in memory (@temp).

A few excerpts from the references below, regarding your specific case with 50k rows:
There can be a big performance difference between using table variables and temporary tables. In most cases, temporary tables are faster than table variables. I took the following tip from the private SQL Server MVP newsgroup and received permission from Microsoft to share it with you. One MVP noticed that although queries using table variables didn't generate parallel query plans on a large SMP box, similar queries using temporary tables (local or global) and running under the same circumstances did generate parallel plans.
More from SQL Mag (subscription required, unfortunately; I'll try to find more resources momentarily).
EDIT: Here is some more in-depth information from CodeProject.
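To illustrate the practical difference behind the excerpts above: in SQL Server 2000, a `#temp` table can be indexed after it is loaded and gets column statistics, whereas a table variable can only get indexes through constraints declared inline (`PRIMARY KEY`/`UNIQUE`) and has no statistics. A minimal sketch, with illustrative names (`#Calc`, `@Calc`) that are assumptions:

```sql
-- #temp table: can add an index after the ~50k-row load, and the
-- optimizer maintains statistics on it, which helps larger row counts.
CREATE TABLE #Calc (Id INT NOT NULL, Val DECIMAL(18, 4));
-- ... bulk load rows here ...
CREATE CLUSTERED INDEX IX_Calc_Id ON #Calc (Id);

-- Table variable: indexing only via inline constraints, and no statistics,
-- so the optimizer tends to assume very few rows regardless of the load.
DECLARE @Calc TABLE (Id INT NOT NULL PRIMARY KEY, Val DECIMAL(18, 4));
```

At 50k rows per table, the lack of statistics on table variables is often the deciding factor, which is why the excerpts lean toward `#temp` tables for workloads of this size.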