Is there a way to test whether the buffer might overflow (exceed 2GB) before a write in FlatBufferBuilder?

Posted on 2025-01-18 21:46:13

I am trying to use the FlatBufferBuilder Java API to serialize a large amount of data (>100GB) into a sequence of size-prefixed flatbuffers.
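
Roughly, the write loop looks like the sketch below. MyRecord and createMyRecord are placeholders for a schema-generated table, not my real schema; the rest is the standard FlatBufferBuilder API (createString, finishSizePrefixed, dataBuffer, clear). One record per buffer is only to keep the example short.

```java
import com.google.flatbuffers.FlatBufferBuilder;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class SizePrefixedWriter {
    public static void main(String[] args) throws IOException {
        try (FileChannel out = FileChannel.open(Paths.get("records.bin"),
                StandardOpenOption.CREATE, StandardOpenOption.WRITE)) {
            FlatBufferBuilder builder = new FlatBufferBuilder(1024);
            for (int i = 0; i < 1_000; i++) {
                int name = builder.createString("record-" + i);
                // MyRecord is a placeholder for a schema-generated table.
                int root = MyRecord.createMyRecord(builder, name, i);
                builder.finishSizePrefixed(root); // prepends the 4-byte size field
                ByteBuffer buf = builder.dataBuffer();
                while (buf.hasRemaining()) {
                    out.write(buf); // append this size-prefixed flatbuffer to the file
                }
                builder.clear(); // reuse the builder for the next message
            }
        }
    }
}
```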

Is there a way to test whether the next write of a compiler-generated Table object might throw an AssertionError tied to growing beyond the 2GB ByteBuffer size limit? There is an int offset() method, but it doesn't account for alignment.

Furthermore, it's hard to extend the FlatBufferBuilder class, since all the fields are private and the only protected member is the finish() method.

Does anyone know if it's possible to perform this bookkeeping to prevent a possible buffer overflow? I know we can't know the size of every object beforehand, but maybe a good hint can be taken from reusing some parts of this calculation.

One possible simple solution could be to add a size cap of a little less than 1GB, checked using offset(). Since the minimum alignment is at most the size of the largest object seen in the buffer, this would ensure it doesn't go above 1GB and the total stays below 2GB. I'm not sure whether it would actually work. This doesn't require making any changes to the builder API.
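
A rough sketch of what I have in mind is below. The soft limit value, the rollover policy, and the flushSegment helper are my own assumptions, not part of the FlatBuffers API; only offset() and the builder constructor come from the library.

```java
import com.google.flatbuffers.FlatBufferBuilder;

public class CappedBuilder {
    // Soft limit well under 1GB, so padding/alignment and one more large record
    // still leave room before the hard 2GB ByteBuffer limit.
    private static final int SOFT_LIMIT_BYTES = 900 * 1024 * 1024;

    private FlatBufferBuilder builder = new FlatBufferBuilder(1 << 20);

    /** Returns a builder that should still have room for the next record. */
    public FlatBufferBuilder builderForNextRecord() {
        // offset() is the number of bytes written so far (it ignores future
        // alignment padding), so this check is conservative rather than exact.
        if (builder.offset() > SOFT_LIMIT_BYTES) {
            flushSegment(builder);                    // hypothetical: emit the current segment
            builder = new FlatBufferBuilder(1 << 20); // start the next flatbuffer in the sequence
        }
        return builder;
    }

    private void flushSegment(FlatBufferBuilder b) {
        // Call finishSizePrefixed(rootOffset) and write b.dataBuffer() out here.
    }
}
```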

I tried writing byte values in an infinite loop, and it exits with an AssertionError tied to the 2GB buffer size limit.
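
Something along these lines (a minimal reproduction of that test, not my real code):

```java
import com.google.flatbuffers.FlatBufferBuilder;

public class OverflowRepro {
    public static void main(String[] args) {
        FlatBufferBuilder builder = new FlatBufferBuilder(1024);
        try {
            while (true) {
                builder.addByte((byte) 0); // backing ByteBuffer keeps doubling as it fills
            }
        } catch (AssertionError e) {
            // FlatBuffers refuses to grow the buffer past the 2GB limit and throws here.
            System.out.println("Hit the limit at offset " + builder.offset() + ": " + e.getMessage());
        }
    }
}
```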

I looked at extending FlatBufferBuilder, but most of the members are private.
