InfluxDB 2 storage retention / max size

Posted 2025-02-03 17:07:11


I am using InfluxDB 2.2 to store & aggregate data on a gateway device. The environment is pretty limited regarding space. I do not know at what interval and how large the data is that gets ingested. Retention is not that much of a requirement. All I want is to make sure that the InfluxDB database does not grow larger than, let's say, 5GB.

I know that I could just set restrictive bounds on the retention, but this does not feel like an ideal solution. Do you see any possibility to achieve this?
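For reference, what I had in mind with "restrictive bounds on the retention" is roughly the following sketch with the influx CLI (the bucket ID is a placeholder, and the CLI is assumed to be configured for the target instance):

```
# Look up the bucket ID, then cap its retention at, for example, 3 days.
influx bucket list
influx bucket update --id <bucket-id> --retention 72h
```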


Comments (1)

逆光下的微笑 2025-02-10 17:07:11


It seems that you are more concerned about disk space. If so, there are several workarounds you could try:

  • Retention policy: this is similar to TTL in other NoSQL databases, and it can help you delete obsolete data automatically. How long you should set the retention policy really depends on the business you are running. You could run the instance for a few days, see how the disk space grows, and then adjust your retention policy.
  • Downsampling: "Downsampling is the process of aggregating high-resolution time series within windows of time and then storing the lower resolution aggregation to a new bucket." Not all data needs to be retrieved at all times. Most of the time, the fresher the data (i.e. hot data), the more frequently it will be fetched. What's more, you might only need the big picture of historical data, i.e. less granular. For example, if you are collecting data at second-level granularity, you could run a downsampling task that only retains the mean of the indicator values at minute or even hour precision. That will save you a lot of space while barely affecting your trending view. A sketch of such a task follows after this list.
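A minimal sketch of such a downsampling setup with the influx CLI, using illustrative bucket names raw (high-resolution source) and downsampled_1m (destination):

```
# Destination bucket for the lower-resolution data (name is illustrative).
influx bucket create --name downsampled_1m --retention 0

# Flux task: every hour, write 1-minute means of the last hour of "raw" data.
cat > downsample.flux <<'EOF'
option task = {name: "downsample-1m", every: 1h}

from(bucket: "raw")
    |> range(start: -task.every)
    |> aggregateWindow(every: 1m, fn: mean)
    |> to(bucket: "downsampled_1m")
EOF

influx task create --file downsample.flux
```

Combined with a short retention on the raw bucket, this keeps disk usage roughly bounded. As far as I know there is no hard size-based cap in InfluxDB 2.x, so leave some headroom below your 5GB target.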

See more details here.
