What are the implications of ignoring the OpenGL typedefs?

Posted 2024-12-09 11:10:02


So, I am using OpenGL which typedefs unsigned integer -> GLuint.

For some reason it feels wrong to sprinkle my program with GLuint, instead of the more generic unsigned integer or uint32_t.

Any thoughts on negative/positive aspects of ignoring the typedefs?
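
For reference, this is roughly the situation; the typedef shown in the comment is what a typical desktop gl.h provides and is illustrative only, not something every platform is guaranteed to use:

```cpp
#include <GL/gl.h>
#include <cstdint>

// On a typical desktop gl.h the header does (roughly):
//   typedef unsigned int GLuint;
// The question is which of these to use for OpenGL object names:
GLuint        tex_typedef = 0;  // the type the API is declared with
unsigned int  tex_plain   = 0;  // the "generic" equivalent (today)
std::uint32_t tex_fixed   = 0;  // the fixed-width alternative
```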


3 Answers

挽你眉间 2024-12-16 11:10:02


The typedefs are there to make your code more portable. If you ever wanted to move to a platform in which a GLuint may have a different underlying type (for whatever reason), it would be wise to use the typedef.
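
A minimal sketch of where that matters in practice, assuming the usual gl.h declarations; pointer parameters are where a mismatch actually bites:

```cpp
#include <GL/gl.h>

// Compile-only sketch: the GL calls assume a current context at runtime.
void create_textures() {
    GLuint a = 0;
    glGenTextures(1, &a);   // glGenTextures takes (GLsizei, GLuint*),
                            // so this is type-correct on every platform

    unsigned int b = 0;
    glGenTextures(1, &b);   // compiles only while GLuint happens to be
                            // unsigned int; on a platform with a different
                            // underlying type, &b no longer matches GLuint*
}
```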

在梵高的星空下 2024-12-16 11:10:02


There is always the chance that your code gets ported to a platform where GLuint != unsigned int. If you are going to ignore the typedefs, then at least add some compile-time checks that result in a compilation error if they differ from what is expected.
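
A minimal sketch of such a check, assuming C++11 or later (C11 offers _Static_assert for the same purpose):

```cpp
#include <GL/gl.h>
#include <type_traits>

// Fails to compile on any platform where the assumption behind writing
// plain unsigned int instead of GLuint stops holding.
static_assert(std::is_same<GLuint, unsigned int>::value,
              "this code assumes GLuint is unsigned int; audit its uses");
```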

ヅ她的身影、若隐若现 2024-12-16 11:10:02


In general, see the above answers by K-ballo and Chad La Guardia, that's the intent behind such typedefs. That, and in some cases to hide the actual datatype in case the API changes in a future revision (not likely going to happen with OpenGL, but I've seen it happen). In case the datatype changes, this requires a recompilation, but no code changes.
Still, one has to say that library developers often overdo this particular aspect of portability to the point of silliness.

In this particular case, the OpenGL specification is very clear about what a GLuint is (chapter 2.4). It is an unsigned integer of at least 32 bits length. They don't leave much room for interpretation or change.

In that respect, there is no chance it could ever be anything other than a uint32_t (as that is the very definition of uint32_t), and there is no good reason why you couldn't use uint32_t in its stead if you prefer to do so (other than that using GLuint makes explicit that a variable is meant to be used with OpenGL, but meh).
It might in principle still be something different than an unsigned int of course, since not much is said about the precise size of an int (other than sizeof(long) >= sizeof(int) >= sizeof(short)).
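
If you do prefer uint32_t, a sketch of making that assumption explicit rather than implicit:

```cpp
#include <GL/gl.h>
#include <cstdint>
#include <type_traits>

// GLuint must be an unsigned integer of at least 32 bits; on the platforms
// in use today that is the same type as uint32_t, and this assertion turns
// the implicit assumption into a compile-time guarantee.
static_assert(std::is_same<GLuint, std::uint32_t>::value,
              "GL object names are stored as uint32_t in this code");

// Deliberately no matching assertion against unsigned int: the spec leaves
// that correspondence to the implementation.
```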
