Fast lossless encoding of SKBitmap images
I'm trying to store large 4096x3072 SKBitmap images with lossless compression as fast as I can. I've tried storing them as PNG using SKImage.FromBitmap(bitmap).Encode(SKEncodedImageFormat.Png, 100) but this was really slow. Then using information from this question and this example code I made a method to store them as a Tiff image, which was a lot faster but still not fast enough for my purposes. The code has to work on Linux as well. This is my current code:
public static class SKBitmapExtensions
{
    public static void SaveToPng(this SKBitmap bitmap, string filename)
    {
        using (Stream s = File.OpenWrite(filename))
        {
            SKData d = SKImage.FromBitmap(bitmap).Encode(SKEncodedImageFormat.Png, 100);
            d.SaveTo(s);
        }
    }

    public static void SaveToTiff(this SKBitmap img, string filename)
    {
        using (var tifImg = Tiff.Open(filename, "w"))
        {
            // Set the tiff information
            tifImg.SetField(TiffTag.IMAGEWIDTH, img.Width);
            tifImg.SetField(TiffTag.IMAGELENGTH, img.Height);
            tifImg.SetField(TiffTag.COMPRESSION, Compression.LZW);
            tifImg.SetField(TiffTag.PHOTOMETRIC, Photometric.RGB);
            tifImg.SetField(TiffTag.ROWSPERSTRIP, img.Height);
            tifImg.SetField(TiffTag.BITSPERSAMPLE, 8);
            tifImg.SetField(TiffTag.SAMPLESPERPIXEL, 4);
            tifImg.SetField(TiffTag.XRESOLUTION, 1);
            tifImg.SetField(TiffTag.YRESOLUTION, 1);
            tifImg.SetField(TiffTag.PLANARCONFIG, PlanarConfig.CONTIG);
            tifImg.SetField(TiffTag.EXTRASAMPLES, 1, new short[] { (short)ExtraSample.UNASSALPHA });

            // Copy the data
            byte[] bytes = img.Bytes;
            // Swap red and blue
            convertSamples(bytes, img.Width, img.Height);

            // Write the image into the memory buffer
            for (int i = 0; i < img.Height; i++)
                tifImg.WriteScanline(bytes, i * img.RowBytes, i, 0);
        }
    }

    private static void convertSamples(byte[] data, int width, int height)
    {
        int stride = data.Length / height;
        const int samplesPerPixel = 4;
        for (int y = 0; y < height; y++)
        {
            int offset = stride * y;
            int strideEnd = offset + width * samplesPerPixel;
            for (int i = offset; i < strideEnd; i += samplesPerPixel)
            {
                byte temp = data[i + 2];
                data[i + 2] = data[i];
                data[i] = temp;
            }
        }
    }
}
And the test code:
SKBitmap bitmap = SKBitmap.Decode("test.jpg");

Stopwatch stopwatch = new();
stopwatch.Start();
int iterations = 20;
for (int i = 0; i < iterations; i++)
    bitmap.SaveToTiff("encoded.tiff");
stopwatch.Stop();
Console.WriteLine($"Average Tiff encoding time for a {bitmap.Width}x{bitmap.Height} image = {stopwatch.ElapsedMilliseconds / iterations} ms");

stopwatch.Restart();
for (int i = 0; i < iterations; i++)
    bitmap.SaveToPng("encoded.png");
stopwatch.Stop();
Console.WriteLine($"Average PNG encoding time for a {bitmap.Width}x{bitmap.Height} image = {stopwatch.ElapsedMilliseconds / iterations} ms");
As a result I get:
Average Tiff encoding time for a 4096x3072 image = 630 ms
Average PNG encoding time for a 4096x3072 image = 3092 ms
Is there any faster way to store these images? I can imagine that I can avoid copying the data at var bytes = img.Bytes but I'm not sure how. The encoded file size for the PNG is 10.3MB and for the Tiff it is 26MB now.
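One way to avoid that full copy, sketched below under the assumption that your SkiaSharp version exposes SKBitmap.GetPixelSpan(): read the pixels through the span (no allocation), copy one row at a time into a small scratch buffer, swap red and blue there, and hand that row to WriteScanline. The SaveToTiffRowBuffer name and the row-buffer layout are illustrative assumptions, not benchmarked code:

using System;
using BitMiracle.LibTiff.Classic;
using SkiaSharp;

public static class SKBitmapTiffRowBufferExtensions
{
    // Sketch: same TIFF tags as SaveToTiff above, but the per-image copy made by
    // img.Bytes is replaced with a single reusable scanline buffer.
    public static void SaveToTiffRowBuffer(this SKBitmap img, string filename)
    {
        using (var tif = Tiff.Open(filename, "w"))
        {
            tif.SetField(TiffTag.IMAGEWIDTH, img.Width);
            tif.SetField(TiffTag.IMAGELENGTH, img.Height);
            tif.SetField(TiffTag.COMPRESSION, Compression.LZW);
            tif.SetField(TiffTag.PHOTOMETRIC, Photometric.RGB);
            tif.SetField(TiffTag.ROWSPERSTRIP, img.Height);
            tif.SetField(TiffTag.BITSPERSAMPLE, 8);
            tif.SetField(TiffTag.SAMPLESPERPIXEL, 4);
            tif.SetField(TiffTag.PLANARCONFIG, PlanarConfig.CONTIG);
            tif.SetField(TiffTag.EXTRASAMPLES, 1, new short[] { (short)ExtraSample.UNASSALPHA });

            ReadOnlySpan<byte> pixels = img.GetPixelSpan(); // direct view, no full copy
            byte[] row = new byte[img.RowBytes];            // one scanline of scratch space

            for (int y = 0; y < img.Height; y++)
            {
                pixels.Slice(y * img.RowBytes, img.RowBytes).CopyTo(row);

                // Swap BGRA -> RGBA in the small row buffer only (assumes Bgra8888 pixels).
                for (int x = 0; x < img.Width * 4; x += 4)
                {
                    byte tmp = row[x];
                    row[x] = row[x + 2];
                    row[x + 2] = tmp;
                }

                tif.WriteScanline(row, y);
            }
        }
    }
}

This avoids the roughly 48 MB allocation per save that img.Bytes implies for a 4096x3072 BGRA image, although the LZW compression itself will probably still dominate the encoding time.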
Comments (1)
If you are not so interested in making the most optimal png (from a file size point of view) then you can get access to some faster encoding options through:
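A minimal sketch of that idea, assuming SkiaSharp's SKPngEncoderOptions, SKBitmap.PeekPixels() and the SKPixmap.Encode(SKWStream, SKPngEncoderOptions) overload are available in your version (the SaveToFastPng name is made up here for illustration):

using SkiaSharp;

public static class FastPngExtensions
{
    // Sketch: trade PNG file size for encoding speed through the encoder options.
    public static void SaveToFastPng(this SKBitmap bitmap, string filename)
    {
        // Arguments are (filterFlags, zLibLevel): NoFilters skips the per-scanline
        // prediction filters, and level 1 is the fastest deflate setting (9 = smallest file).
        var options = new SKPngEncoderOptions(SKPngEncoderFilterFlags.NoFilters, 1);

        using (SKPixmap pixmap = bitmap.PeekPixels())
        using (var stream = new SKFileWStream(filename))
        {
            // Encode straight from the bitmap's pixel buffer into the file.
            pixmap.Encode(stream, options);
        }
    }
}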
In the example above, the filter flags and the zlib level are the speed/size knobs: skipping the scanline filters and using a low compression level encodes considerably faster while remaining lossless, at the cost of a larger file.