What are the consequences of ignoring the OpenGL typedefs?
So, I am using OpenGL, which typedefs unsigned int -> GLuint.

For some reason it feels wrong to sprinkle my program with GLuint instead of the more generic unsigned int or uint32_t.

Any thoughts on the negative/positive aspects of ignoring the typedefs?
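For reference, on a typical desktop platform the GL header boils down to something like this (a sketch; the exact contents vary by platform and SDK):

    // Typical desktop <GL/gl.h> (contents vary by platform/SDK):
    typedef unsigned int GLuint;

    // So the choice is between, e.g.:
    GLuint       texture_a;  // the library's typedef
    unsigned int texture_b;  // the underlying type spelled out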
3 Answers
The typedefs are there to make your code more portable. If you ever wanted to move to a platform on which a GLuint may have a different underlying type (for whatever reason), it would be wise to use the typedef.

There is always the chance that your code gets ported to a platform where GLuint != unsigned int. If you are going to ignore the typedefs, then at least add some compile-time checks that result in a compilation error if the types differ from what is expected.
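A minimal sketch of such a check, assuming C++11 and that you want to pin GLuint to uint32_t specifically; the static_assert fails the build on any platform where that assumption breaks:

    #include <cstdint>
    #include <type_traits>
    #include <GL/gl.h>  // or whichever header/loader declares GLuint

    static_assert(std::is_same<GLuint, std::uint32_t>::value,
                  "GLuint is not uint32_t on this platform; the generic "
                  "integer types used in place of the GL typedefs are unsafe");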
In general, see the above answers by K-ballo and Chad La Guardia; that's the intent behind such typedefs. That, and in some cases to hide the actual data type in case the API changes in a future revision (not likely to happen with OpenGL, but I've seen it happen). In case the data type changes, a recompilation is required, but no code changes.
Still, one has to say that library developers often overdo this particular aspect of portability, to the point of silliness.
In this particular case, the OpenGL specification is very clear about what a GLuint is (chapter 2.4): it is an unsigned integer of at least 32 bits in length. That doesn't leave much room for interpretation or change. As far as that goes, there is no chance it could ever be anything other than a uint32_t (as that is the very definition of uint32_t), and there is no good reason why you couldn't use uint32_t in its stead if you prefer to do so (other than that GLuint makes it explicit that a variable is meant to be used with OpenGL, but meh).

It might in principle still be something different from an unsigned int, of course, since not much is said about the precise size of an int (other than sizeof(long) >= sizeof(int) >= sizeof(short)).
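To illustrate where that difference would bite in practice (a sketch; glGenTextures is the real GL 1.1 entry point, but the failure mode is hypothetical): pointer arguments stop converting once the types are distinct, even if their sizes agree.

    #include <cstdint>
    #include <GL/gl.h>  // void glGenTextures(GLsizei n, GLuint *textures)

    void create_texture() {       // requires a current GL context to run
        std::uint32_t id;         // generic type used instead of GLuint
        glGenTextures(1, &id);    // compiles only while GLuint and uint32_t
                                  // are the same type; if uint32_t were, say,
                                  // unsigned long, uint32_t* would not
                                  // convert to GLuint* even at equal sizes
    }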