In C, why is sizeof(char) 1 when 'a' is an int?
I tried
printf("%d, %d\n", sizeof(char), sizeof('c'));
and got 1, 4 as output. If the size of a character is one, why does 'c'
give me 4? I guess it's because it's an integer. So when I do char ch = 'c';
is there an implicit conversion happening, under the hood, from that 4-byte value to a 1-byte value when it's assigned to the char variable?
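As an aside, sizeof yields a size_t, so %zu (not %d) is the matching printf specifier. A minimal sketch of the same experiment, assuming a platform where int is 4 bytes:

#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t, so %zu is the matching format specifier */
    printf("%zu, %zu\n", sizeof(char), sizeof('c'));  /* e.g. "1, 4" where int is 4 bytes */
    return 0;
}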
Comments (5)
In C, 'a' is an integer character constant (!?!), so 4 is correct for your architecture. It is implicitly converted to char for the assignment. sizeof(char) is always 1 by definition; the unit is the byte, whose width (CHAR_BIT) is at least 8 bits.
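A minimal sketch of that implicit conversion, assuming an ASCII platform: the int-typed constant 'c' has the value 99, which fits in a char, so the assignment simply narrows it without losing information:

#include <stdio.h>

int main(void)
{
    char ch = 'c';               /* the int-valued constant (99 in ASCII) is narrowed to char */
    printf("%d %d\n", ch, 'c');  /* both print 99: the value survives the conversion */
    return 0;
}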
The C standard says that a character literal like 'a' is of type int, not type char. It therefore has (on your platform) sizeof == 4. See this question for a fuller discussion.
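One way to observe the type from code, assuming a C11 compiler, is _Generic, which selects a branch based on the type of its controlling expression. A small sketch (TYPE_NAME is just an illustrative helper macro):

#include <stdio.h>

/* illustrative helper: maps a few types to printable names via C11 _Generic */
#define TYPE_NAME(x) _Generic((x), char: "char", int: "int", default: "other")

int main(void)
{
    printf("%s\n", TYPE_NAME('a'));        /* prints "int"  */
    printf("%s\n", TYPE_NAME((char)'a'));  /* prints "char" */
    return 0;
}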
It is the normal behavior of the sizeof operator (see Wikipedia): sizeof returns the size of the data type; for char, you get 1. sizeof also returns the size of the type of a variable or expression; as a character literal is typed as int, you get 4.
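To illustrate that distinction, here is a small sketch, assuming int is 4 bytes: sizeof applied to a type name versus sizeof applied to an expression, where the result follows the expression's type:

#include <stdio.h>

int main(void)
{
    char c = 'c';
    printf("%zu\n", sizeof(char));  /* 1: sizeof applied to the type name char */
    printf("%zu\n", sizeof c);      /* 1: the expression c has type char */
    printf("%zu\n", sizeof 'c');    /* e.g. 4: the character literal has type int */
    return 0;
}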
This is covered in ISO C11 6.4.4.4 Character constants, though it's largely unchanged from earlier standards. That states, in paragraph 10: "An integer character constant has type int."
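Assuming a C11 compiler, that rule can be checked at compile time with static_assert from <assert.h>; a minimal sketch:

#include <assert.h>  /* provides the static_assert macro in C11 */

/* compile-time check of the rule quoted above */
static_assert(sizeof('a') == sizeof(int), "a character constant has type int in C");

int main(void)
{
    return 0;
}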
According to the ANSI C standard, a char gets promoted to an int in contexts where integers are used; you used an integer format specifier in the printf, hence the different values. A char is always 1 byte by definition, but how many bits make up that byte (CHAR_BIT) is implementation-defined, depending on the compiler and platform.
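A short sketch of those two points, assuming a hosted C99-or-later implementation: a char passed to a variadic function such as printf undergoes the default argument promotions and arrives as an int, and the byte width is reported by CHAR_BIT from <limits.h>:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    char ch = 'c';
    printf("%d\n", ch);                   /* ch is promoted to int in the variadic call, so %d matches */
    printf("CHAR_BIT = %d\n", CHAR_BIT);  /* bits per byte; sizeof(char) is still 1 regardless */
    return 0;
}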