Storing large amounts of sensor data records

Posted on 2024-10-13 03:23:27

I need to create a database that saves sensor data that will later be queried to generate reports (display a graph and AVG/MAX/MIN values for a given timeframe).

The data points look like this:

CREATE TABLE [dbo].[Table_1](
    [time] [datetime] NOT NULL,
    [sensor] [int] NOT NULL,
    [value] [decimal](18, 0) NULL
)

Data can be added in intervals ranging from seconds to minutes (depending on the sensor).

Should I worry about my database growing too big when several years of data accumulate? (The DB will run on MS SQL Server 2008 Workgroup edition.)
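For the reporting side, a query along these lines would return the AVG/MAX/MIN values per sensor for a given timeframe (a minimal sketch against the table above; the date range is just a placeholder):

SELECT [sensor],
       AVG([value]) AS avg_value,
       MAX([value]) AS max_value,
       MIN([value]) AS min_value
FROM [dbo].[Table_1]
WHERE [time] >= '20150101' AND [time] < '20150201'
GROUP BY [sensor]

An index (or clustered primary key) on ([time], [sensor]) or ([sensor], [time]) would most likely be needed to keep such range queries fast as the table grows.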

Comments (4)

谈情不如逗狗 2024-10-20 03:23:28

If you are worried about the database growing too big, then I would suggest having an Archive_Table with the same structure and archiving data at some interval, such as once a month or every 6 months (entirely based on the volume of data).

This would let you keep the number of records in your main table in check, and of course the archive table would still be available for report generation when you need it.
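A minimal sketch of that archiving step (assuming [dbo].[Archive_Table] has the same three columns as [dbo].[Table_1]; the one-month cutoff is only an example):

DECLARE @cutoff datetime = DATEADD(month, -1, GETDATE())

BEGIN TRANSACTION

    -- copy everything older than the cutoff into the archive table...
    INSERT INTO [dbo].[Archive_Table] ([time], [sensor], [value])
    SELECT [time], [sensor], [value]
    FROM [dbo].[Table_1]
    WHERE [time] < @cutoff

    -- ...then remove those rows from the live table
    DELETE FROM [dbo].[Table_1]
    WHERE [time] < @cutoff

COMMIT TRANSACTION

Computing the cutoff once means the INSERT and the DELETE act on exactly the same set of rows.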

攒眉千度 2024-10-20 03:23:27

There are specialized historian databases, such as OSISoft's PI Historian, that handle this type of data a lot better than a relational database. With PI you can configure a compression deviation for each data point, so that a new value is not archived unless it differs from the last archived value by at least that deviation. When you query the historical data for a given point, you can ask PI to interpolate what the value would have been at the specified time, even if that time falls between archived values.

It's capable of a whole lot more, but you will have to explore that on your own because I don't intend on becoming an OSISoft salesman. However, this is definitely the way you want to go for storing large quantities of sensor data.
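This is not PI's API, but purely to illustrate the interpolation idea against the original table, here is a rough T-SQL sketch that estimates one sensor's value at a timestamp lying between two stored samples (the sensor id and target time are made up):

DECLARE @sensor int = 42
DECLARE @t datetime = '20101020 12:00:00'

;WITH before_sample AS (
    SELECT TOP 1 [time], [value]
    FROM [dbo].[Table_1]
    WHERE [sensor] = @sensor AND [time] <= @t
    ORDER BY [time] DESC
), after_sample AS (
    SELECT TOP 1 [time], [value]
    FROM [dbo].[Table_1]
    WHERE [sensor] = @sensor AND [time] > @t
    ORDER BY [time] ASC
)
SELECT b.[value]
       + (a.[value] - b.[value])
         * DATEDIFF(second, b.[time], @t)
         / NULLIF(DATEDIFF(second, b.[time], a.[time]), 0) AS interpolated_value
FROM before_sample AS b
CROSS JOIN after_sample AS a

A historian product does this (plus the deadband-style compression) for you; the sketch only shows what asking for a value between archived samples means.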

憧憬巴黎街头的黎明 2024-10-20 03:23:27

It all depends on what resources and effort you want to expend on it. At 1 row per second, that table would still come to less than 1GB per sensor per year, which is very small. If you have thousands of sensors, then you might want to consider creating summary tables to help with the reporting and analysis of the data.

Sensor data like this is often very repetitive. There are more convenient ways to store repeated values - for example, storing one row with a range of times rather than multiple rows with different times.

There are many software packages that can help with storing and managing this kind of time series data. There is also a significant body of research and literature on the subject, which might help you. If you aren't already familiar with it then Google for terms like "Process Historian", "Complex Event Processing" and "SCADA".
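As a sketch of the summary-table idea (the table name, the hourly grain, and the column types are all assumptions), a rollup like this could be rebuilt or incrementally refreshed on a schedule, with the reports reading from it instead of the raw table:

CREATE TABLE [dbo].[Sensor_Hourly](
    [hour] [datetime] NOT NULL,
    [sensor] [int] NOT NULL,
    [avg_value] [decimal](18, 4) NULL,
    [max_value] [decimal](18, 0) NULL,
    [min_value] [decimal](18, 0) NULL,
    [samples] [int] NOT NULL,
    PRIMARY KEY ([hour], [sensor])
)

INSERT INTO [dbo].[Sensor_Hourly] ([hour], [sensor], [avg_value], [max_value], [min_value], [samples])
SELECT DATEADD(hour, DATEDIFF(hour, 0, [time]), 0),   -- truncate each timestamp to its hour
       [sensor],
       AVG(CAST([value] AS decimal(18, 4))),
       MAX([value]),
       MIN([value]),
       COUNT(*)
FROM [dbo].[Table_1]
GROUP BY DATEADD(hour, DATEDIFF(hour, 0, [time]), 0), [sensor]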

国粹 2024-10-20 03:23:27

It depends on how you're going to use the data, what additional indexes you add, how many sensors there are, etc.

That table, as shown, could store 150 million rows (~ 1 sensor x 1 recording per second x 5 years) in ~6GB of space (assuming a heap). The file size limit is 16 terabytes, and I'm not aware of any restrictions on this for Workgroup edition.
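Back-of-the-envelope version of that estimate (assuming a heap, no extra indexes, and roughly 30-40 bytes per row once row overhead is counted):

1 row/sec x 86,400 sec/day x 365 days ≈ 31.5 million rows per sensor per year
31.5 million rows x 5 years ≈ 158 million rows
158 million rows x ~30-40 bytes/row ≈ 5-6 GB
(fixed data per row: datetime 8 bytes + int 4 bytes + decimal(18,0) 9 bytes, plus row header and slot overhead)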
