Trouble converting hex to ASCII in C#

Posted 2024-10-31 14:28:20


Having issues with the code below; I get this error ...

value was either too large or too small for a character... on the line

sb.Append(Convert.ToChar(Convert.ToUInt32(hs, 16)));

        for (int i = 0; i < hexString.Length - 1; i += 2)
        {
            String hs = hexString.Substring(i, i + 2);

            sb.Append(Convert.ToChar(Convert.ToUInt32(hs, 16)));

        }

Any advice? I'm new to C#.

thanks :)


Comments (3)

时光病人 2024-11-07 14:28:20


It seems that your hex substring, when converted to a UInt32 in base 16 (hex), is too large (out of range) for the character set you're using.

Check your conversion to UInt32 and make sure the value can actually be converted to a char, i.e. that it's valid.
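
As a rough sketch of that kind of check (hs and sb as in the question's loop; the guard itself is illustrative and not part of the original answer):

    uint value = Convert.ToUInt32(hs, 16);
    if (value <= char.MaxValue)   // 0xFFFF, the largest value a single char can hold
    {
        sb.Append(Convert.ToChar(value));
    }
    else
    {
        // Convert.ToChar would throw OverflowException for this value.
        throw new FormatException(string.Format("0x{0:X} does not fit in a single char.", value));
    }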

吃兔兔 2024-11-07 14:28:20


EDIT

As @Lasse points out, Substring takes a start index and a length, but it looks like you're trying to pass it a start index and a stop index, since you're passing i + 2 to every call. This means that the first iteration will create a two-character substring, the second a four-character substring, and so on; the longer substrings eventually parse to values too large for a char. Just pass 2 to it:

String hs = hexString.Substring(i, 2);

That should correct the actual problem.
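
Put together, a minimal sketch of the corrected loop (assuming, as in the question, that hexString holds an even-length string of hex digits, e.g. "48656C6C6F" for "Hello"):

    var sb = new System.Text.StringBuilder();

    for (int i = 0; i < hexString.Length - 1; i += 2)
    {
        // Length 2, not i + 2: exactly one two-digit hex pair per iteration.
        string hs = hexString.Substring(i, 2);
        sb.Append(Convert.ToChar(Convert.ToUInt32(hs, 16)));
    }

    string result = sb.ToString();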

While this isn't breaking anything, you should be aware that what you're doing is not converting to ASCII. ASCII is a particular character encoding, and Convert.ToChar converts a number to its corresponding Unicode (specifically UTF-16) character. As long as your values range only from 0 to 127 (00 to 7F in hex), you're fine for all practical purposes, since Unicode shares its first 128 code points with the standard ASCII character set. If, however, your characters use one of the extensions of ASCII (Latin-1, for example, which is common on Windows), then those characters will not match.
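
For a concrete illustration of the mismatch (a hypothetical example using the Windows-1252 superset of ASCII, not from the original answer):

    // Convert.ToChar maps 0x93 to U+0093, an invisible C1 control character,
    // not the left curly quote that byte 0x93 represents in Windows-1252.
    char fromConvert = Convert.ToChar(0x93);

    // Decoding the same byte with the Windows-1252 encoding yields the quote.
    // (On .NET Core / .NET 5+, register the System.Text.Encoding.CodePages
    // package first: Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);)
    string fromCp1252 = System.Text.Encoding.GetEncoding(1252)
                                            .GetString(new byte[] { 0x93 });   // "\u201C"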

If your data is in ASCII format and you need to support values greater than 127, you can convert your hex string to a byte[] and then let the ASCIIEncoding class decode that data as ASCII:

byte[] data = new byte[hexString.Length / 2];

for (int i = 0; i < hexString.Length - 1; i += 2)
{
    // NumberStyles.HexNumber is required here; without it, byte.Parse reads
    // the pair as decimal and throws on the digits A-F.
    data[i / 2] = byte.Parse(hexString.Substring(i, 2),
                             System.Globalization.NumberStyles.HexNumber);
}

string output = System.Text.Encoding.ASCII.GetString(data);
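
Wrapped into a complete, runnable program (the sample input here is hypothetical):

    using System;
    using System.Globalization;
    using System.Text;

    class HexToAscii
    {
        static void Main()
        {
            string hexString = "48656C6C6F";   // "Hello" in ASCII hex

            byte[] data = new byte[hexString.Length / 2];
            for (int i = 0; i < hexString.Length - 1; i += 2)
            {
                data[i / 2] = byte.Parse(hexString.Substring(i, 2),
                                         NumberStyles.HexNumber);
            }

            string output = Encoding.ASCII.GetString(data);
            Console.WriteLine(output);   // prints: Hello
        }
    }
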
病女 2024-11-07 14:28:20
string assumedASCII = string.Format(@"\x{0:x4}", (int)hexString);