Azure Stream Analytics: running aggregation for the current day
I'm new to Azure Stream Analytics. Every time a new event arrives at the Azure Stream Analytics job, I need to push the running totals since the start of the day to Power BI (a live dashboard). I created the following SQL query to compute this:
SELECT
    Factory_Id,
    COUNT(0) as events_count,
    MAX(event_create_time) as last_event_time,
    SUM(event_value) as event_value_total
INTO
    [powerbi]
FROM
    [eventhub] TIMESTAMP BY event_create_time
WHERE DAY(event_create_time) = DAY(System.Timestamp) and MONTH(event_create_time) = MONTH(System.Timestamp) and YEAR(event_create_time) = YEAR(System.Timestamp)
GROUP BY Factory_Id, SlidingWindow(day,1)
But this does not give me the desired result: I get totals for the last 24 hours (not only for the current day), and sometimes records arrive out of order, with a larger last_event_time but a smaller events_count than an earlier record. The question is: what am I doing wrong, and how do I achieve the expected result?
EDIT following comment: This computes the results for the last 24h, but what's needed is the running sum/count for the current day (from 00:00 until now). See the updated answer below.
I'm wondering if an analytics approach would work better than an aggregation here.
Instead of using a time window, you can calculate and emit a record for each event in the input.
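The answer's original query is missing from this copy, but the idea can be sketched with Stream Analytics' OVER clause, which applies an aggregate per input event over a trailing duration rather than over a grouped window. Column and input names reuse the question's schema; treat this as an approximation, not the answer's exact query:

```sql
-- Emits one output row per input event, with aggregates computed
-- over the trailing 24 hours per factory (not anchored to midnight).
SELECT
    Factory_Id,
    COUNT(*) OVER (PARTITION BY Factory_Id LIMIT DURATION(day, 1)) AS events_count,
    event_create_time AS last_event_time,
    SUM(event_value) OVER (PARTITION BY Factory_Id LIMIT DURATION(day, 1)) AS event_value_total
INTO [powerbi]
FROM [eventhub] TIMESTAMP BY event_create_time
```

Because every input event produces its own output row, the dashboard always sees the latest totals without waiting for a window to close.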
The only hiccup is events landing on the same timestamp: you won't get a single record for that timestamp, because each simultaneous input event still produces its own output row.
We may want to add a step to the query to deal with it if it's an issue for your dashboard. Let me know!
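One way to handle that step (a sketch, not taken from the original answer): a snapshot window groups events that share the exact same timestamp, so simultaneous events collapse into a single record. In Stream Analytics this is expressed by adding System.Timestamp() to the GROUP BY:

```sql
-- Collapse events sharing a timestamp into one record per factory.
SELECT
    Factory_Id,
    System.Timestamp() AS event_create_time,
    COUNT(*) AS events_count,
    SUM(event_value) AS event_value
FROM [eventhub] TIMESTAMP BY event_create_time
GROUP BY Factory_Id, System.Timestamp()
```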
EDIT following comment
This new version will emit progressive results on a daily tumbling window. To do that, every time we get a new record, we collect the last 24h. Then we remove the rows from the previous day, and re-calculate the new aggregates. To collect properly, we first need to make sure we only have 1 record per timestamp.
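The updated query itself did not survive in this copy of the answer, but the steps described above might be sketched roughly as follows (names reuse the question's schema; the original query may differ in detail):

```sql
WITH OneRecordPerTimestamp AS (
    -- Step 1: make sure we only have 1 record per timestamp
    -- (snapshot window: group events sharing the same timestamp)
    SELECT
        Factory_Id,
        System.Timestamp() AS ts,
        COUNT(*) AS events_count,
        SUM(event_value) AS event_value
    FROM [eventhub] TIMESTAMP BY event_create_time
    GROUP BY Factory_Id, System.Timestamp()
),
Last24h AS (
    -- Step 2: on each new record, collect the last 24h of records
    SELECT
        Factory_Id,
        System.Timestamp() AS window_end,
        COLLECT() AS records
    FROM OneRecordPerTimestamp
    GROUP BY Factory_Id, SlidingWindow(day, 1)
)
-- Step 3: drop the rows belonging to the previous day,
-- then re-calculate the aggregates over the current day only
SELECT
    L.Factory_Id,
    SUM(r.ArrayValue.events_count) AS events_count,
    MAX(r.ArrayValue.ts) AS last_event_time,
    SUM(r.ArrayValue.event_value) AS event_value_total
INTO [powerbi]
FROM Last24h L
CROSS APPLY GetArrayElements(L.records) AS r
WHERE DAY(r.ArrayValue.ts) = DAY(L.window_end)
GROUP BY L.Factory_Id, L.window_end, System.Timestamp()
```

A full version of the day filter would also compare month and year, as the question's WHERE clause does, to avoid matching the same day number in an adjacent month.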
Let me know how it goes.