Compressing binary data
On one of the working steps of my algorithm I have a big array of binary data that I want to compress.
Which algorithm (or maybe a standard class) would you advise using to compress the data as efficiently as possible?
EDIT:
The data is first represented as a byte[n] of 0s and 1s. Then I join every 8 bytes into 1 and get a byte[n/8] array.
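For illustration, the packing step might look like this (the PackBits name is just for the sketch):

    // Sketch of the packing step described in the edit: every 8 input bytes,
    // each holding 0 or 1, are folded into a single output byte (MSB first).
    static byte[] PackBits(byte[] bits)
    {
        byte[] packed = new byte[(bits.Length + 7) / 8];
        for (int i = 0; i < bits.Length; i++)
        {
            if (bits[i] != 0)
                packed[i / 8] |= (byte)(1 << (7 - i % 8));
        }
        return packed;
    }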
2 Answers
GZipStream or DeflateStream are pretty standard classes to use in such situations. Obviously, depending on the binary data you are trying to compress, you will get a better or worse compression ratio. For example, if you try to compress a jpeg image with those algorithms, you cannot expect a very good compression ratio. If, on the other hand, the binary data represents text, it will compress nicely.
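For example, a minimal in-memory round trip with DeflateStream might look like this (GZipStream is used the same way; it only adds a small header and checksum around the compressed payload):

    using System.IO;
    using System.IO.Compression;

    // Compress a byte[] in memory with DeflateStream.
    static byte[] Compress(byte[] data)
    {
        using (var output = new MemoryStream())
        {
            using (var deflate = new DeflateStream(output, CompressionMode.Compress))
            {
                deflate.Write(data, 0, data.Length);
            }
            return output.ToArray();   // MemoryStream keeps its buffer after the stream is closed
        }
    }

    // Decompress the result back into the original byte[].
    static byte[] Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var deflate = new DeflateStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            deflate.CopyTo(output);
            return output.ToArray();
        }
    }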
I'll add DotNetZip and SharpZLib. The .NET "base" classes (GZipStream/DeflateStream) are stream-based, so they compress a single stream of data (a stream is not a file, but let's say that the content of a file can be read as a stream). DotNetZip is more similar to the classic PkZip/WinZip/WinRar archivers.
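For comparison, a minimal sketch assuming the DotNetZip Ionic.Zip.ZipFile API; the entry and file names here are just placeholders:

    using Ionic.Zip;   // DotNetZip

    // Put an in-memory byte[] into a .zip archive as a named entry.
    static void SaveAsZip(byte[] data)
    {
        using (var zip = new ZipFile())
        {
            zip.AddEntry("data.bin", data);
            zip.Save("data.zip");
        }
    }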