SQL query to calculate a total running time
This is an example of the dataset:
I am looking for an SQL query that calculates the integral of the curve defined by the example data.
Since the value of the curve is a boolean (0/1), integrating it over time in seconds also gives the time, in seconds, that the pump ran.
The interval would start at 00:00:00.000 and end at 23:59:59.000.
I guess I have to scan the data from top to bottom, check the next or previous value, and compute the time differences, but I have no idea how to do it.
In my example: from midnight until 2 o'clock the value is 0; from 2 o'clock until 4 o'clock the pump is on, so 04:00:06 - 02:00:07 = 7199 s; it is on again from 15 o'clock until 17 o'clock, so 17:00:04 - 15:00:04 = 7200 s; and it is on from 22:00:03 until 23:59:59, because no 0 value follows.
Basically I should add up the seconds during which the value is 1, up to the next 0 or to 23:59:59.
In a day I might have more than 10,000 samples, taken at different times, not necessarily hourly.
A cursor could scroll through the data: if the next value is 1, add the seconds given by date(next) - date(cursor); if the next value is 0, do not count that interval.
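To make the per-interval arithmetic concrete, this is the kind of difference I mean for a single interval (a minimal sketch assuming SQL Server's DATEDIFF; the date is just a placeholder so the times form full datetimes):

    -- 04:00:06 minus 02:00:07 on the same (placeholder) day is 7199 seconds
    SELECT DATEDIFF(second,
                    '2024-01-01 02:00:07',   -- sample where the value becomes 1
                    '2024-01-01 04:00:06')   -- next sample, where the value is 0 again
           AS seconds_on;                    -- returns 7199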
Comments (1)
You can use LAG() to get the previous sample. Then computing the area under the curve should be trivial. For example:
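A minimal sketch of that per-sample calculation, assuming the samples live in a table called pump_log with columns ts (datetime) and value (0/1); the names and the SQL Server flavored DATEDIFF are assumptions, so adjust them to your schema and dialect:

    -- For each sample, look back at the previous row with LAG()
    -- and compute the seconds elapsed since that previous sample.
    SELECT
        ts,
        value,
        LAG(value) OVER (ORDER BY ts) AS prev_value,
        DATEDIFF(second, LAG(ts) OVER (ORDER BY ts), ts) AS seconds_since_prev
    FROM pump_log;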
The total area under the curve could be:
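Continuing the same sketch, the total would sum those per-interval seconds over the intervals whose starting sample had value = 1:

    WITH deltas AS (
        SELECT
            LAG(value) OVER (ORDER BY ts) AS prev_value,
            DATEDIFF(second, LAG(ts) OVER (ORDER BY ts), ts) AS seconds_since_prev
        FROM pump_log
    )
    SELECT SUM(seconds_since_prev) AS total_seconds_on
    FROM deltas
    WHERE prev_value = 1;  -- an interval counts only if the pump was on at its start

If the last sample of the day is still 1, you would add the seconds from that sample up to 23:59:59 on top of this, as described in the question.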