Dealing with large result sets in SQL Server Analysis Services
I have a database that contains data about articles, structures, and manufacturers. This means an article is linked to one manufacturer and to N structure nodes (think of them as article-classification nodes).
Querying articles using T-SQL with a lot of conditions is currently too slow to be usable for an e-shop (it should stay below 1 second), even with good hardware and properly indexed tables.
Now I wonder if it would make sense to access this data through an OLAP cube. I already developed one to get aggregations, such as: how many articles of manufacturer X exist below node Y, recursively?
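For reference, an aggregation like that comes back from the cube as a single cell. A minimal MDX sketch, assuming a cube named [Articles] with an [Article Count] measure, a [Manufacturer] dimension, and a [Structure].[Nodes] hierarchy (all of these object names are assumptions for illustration, not taken from the actual cube):

```
-- All object names below are assumed for illustration.
SELECT
  [Measures].[Article Count] ON COLUMNS
FROM [Articles]
WHERE (
  [Manufacturer].[Manufacturer].&[X],   -- manufacturer X
  [Structure].[Nodes].&[Y]              -- node Y; its value already aggregates all descendants
);
```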
These aggregations are pretty fast. Now I wonder whether it also makes sense to retrieve whole article result sets through the cube, meaning: give me every single article ID of manufacturer X that exists below node Y, recursively. Because the result sets can be quite large, the query takes even longer.
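Pulling the detail rows out of the cube would look roughly like the sketch below (same assumed names as above, plus an [Article].[Article ID] attribute hierarchy). Putting every article ID on the rows axis is exactly what makes the result set large:

```
SELECT
  [Measures].[Article Count] ON COLUMNS,
  NON EMPTY [Article].[Article ID].[Article ID].MEMBERS ON ROWS  -- one row per article ID
FROM [Articles]
WHERE (
  [Manufacturer].[Manufacturer].&[X],
  [Structure].[Nodes].&[Y]
);
```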
Therefore my question: is there a way to deal with large result sets in SSAS, or is this completely the wrong direction to take?
Comments (1)
You can definitely deal with large sets of data and make them perform decently in SSAS by leveraging aggregations. Of course, if you're going over the wire, that's still a lot of data to move, so keep that in mind: the query will return quickly, but the results will take a while to transfer.
The real power of SSAS is being able to be targeted in your approach. Instead of saying "give me everything", you can start at a high level, drill down, find the level you want, and keep drilling down until you reach the data you really want.
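To make that concrete, here is a hedged sketch of the targeted approach, reusing the assumed object names from the sketches in the question: instead of listing every article, ask only for the children of the node the user is currently looking at, and drill one level further on the next request.

```
-- Drill down one level at a time instead of fetching everything at once.
SELECT
  [Measures].[Article Count] ON COLUMNS,
  NON EMPTY [Structure].[Nodes].&[Y].CHILDREN ON ROWS  -- only the next level below node Y
FROM [Articles]
WHERE ([Manufacturer].[Manufacturer].&[X]);
```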