Fastest iPhone blit routine?

I have a UIView subclass onto which I need to blit a UIImage. There are several ways to skin this cat depending on which series of APIs you prefer to use, and I'm interested in the fastest. Would it be UIImage's drawAtPoint or drawInRect? Or perhaps the C-based CoreGraphics routines, or something else? I have no qualms about altering my source image data format if it'll make the blitting that much faster.

To describe my situation my app has anywhere from ~10 to ~200 small UIViews (64x64), a subset of which will need to be redrawn based on user interaction. My current implementation is a call to drawAtPoint inside my UIView subclass' drawRect routine. If you can think of a better way to handle this kind of scenario, I'm all ears (well, eyes).
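
For concreteness, a minimal sketch of that setup, assuming ARC; the TileView name and its image property are illustrative, not taken from the actual project:

#import <UIKit/UIKit.h>

@interface TileView : UIView
@property (nonatomic, strong) UIImage *image;
@end

@implementation TileView

// UIKit calls drawRect: whenever the view has been marked dirty; the blit in
// question is the drawAtPoint call below.
- (void)drawRect:(CGRect)rect {
    // The view and the image are both 64x64, so no scaling is involved.
    [self.image drawAtPoint:CGPointZero];
}

- (void)setImage:(UIImage *)image {
    _image = image;
    // Mark the view dirty so drawRect: runs on the next display pass.
    [self setNeedsDisplay];
}

@end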

Comments (4)

谎言 2024-08-11 09:20:18

Using an OpenGL view may be the fastest of all. Keep an age cache of images (or, if you know a better way to determine when certain images can be removed from the cache, by all means use that) and preload as many images as you can while the app is idle. It should be very quick, with almost no Objective-C calls involved (just -draw).
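
A minimal sketch of the cache-and-preload part of this suggestion (leaving the OpenGL rendering itself aside), assuming ARC; the TileImageCache class and the use of NSCache as the "age cache" are illustrative choices, not anything prescribed by the answer:

#import <UIKit/UIKit.h>

@interface TileImageCache : NSObject
@property (nonatomic, strong) NSCache *cache;   // image name -> UIImage
- (void)preloadImagesNamed:(NSArray *)names;
- (UIImage *)imageNamed:(NSString *)name;
@end

@implementation TileImageCache

- (instancetype)init {
    if ((self = [super init])) {
        _cache = [[NSCache alloc] init];
    }
    return self;
}

- (void)preloadImagesNamed:(NSArray *)names {
    // Load images on a background queue while the app is otherwise idle.
    // Note that UIKit may still defer the actual PNG/JPEG decode until first
    // draw; the forced-decompression idea in the last answer covers that.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        for (NSString *name in names) {
            UIImage *image = [UIImage imageNamed:name];
            if (image) {
                [self.cache setObject:image forKey:name];
            }
        }
    });
}

- (UIImage *)imageNamed:(NSString *)name {
    // NSCache evicts entries under memory pressure, standing in for the
    // "age cache" mentioned above.
    UIImage *image = [self.cache objectForKey:name];
    if (!image) {
        image = [UIImage imageNamed:name];
        if (image) {
            [self.cache setObject:image forKey:name];
        }
    }
    return image;
}

@end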

流心雨 2024-08-11 09:20:18

While not a "blit" at all, given the requirements of the problem (many small images with various state changes) I was able to keep the different states to redraw in their own separate UIImageView instances, and just showed/hid the appropriate instance given the state change.

小忆控 2024-08-11 09:20:18

Since CALayer is lightweight and fast, I would give it a try.

Thierry
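
A minimal sketch of what the CALayer route could look like, assuming ARC; the MakeTileLayer helper is illustrative and not from any particular library:

#import <QuartzCore/QuartzCore.h>
#import <UIKit/UIKit.h>

static CALayer *MakeTileLayer(UIImage *image, CGPoint origin) {
    CALayer *tileLayer = [CALayer layer];
    tileLayer.frame = CGRectMake(origin.x, origin.y, 64.0, 64.0);
    // Assigning a CGImage to contents hands the bitmap straight to the
    // compositor; no drawRect: is involved when the image changes later.
    tileLayer.contents = (__bridge id)image.CGImage;
    return tileLayer;
}

// Updating a tile on user interaction is then just a contents swap:
//   tileLayer.contents = (__bridge id)newImage.CGImage;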

旧情勿念 2024-08-11 09:20:18

The fastest blit implementation you are going to find is in my AVAnimator library; it contains an ARM asm implementation of a blit for a CoreGraphics buffer, so have a look at the source. The way you could make use of it would be to create a single graphics context the size of the whole screen, blit your specific image changes into that single context, then create a UIImage from it and set it as the image of a UIImageView. That involves one GPU upload per refresh, so it does not depend on how many images you render into the buffer.

But you will likely not need to go that low level. You should first try making each 64x64 image into a CALayer and then updating each layer with the contents of an image that is the exact size of the layer (64x64). The only tricky thing is that you will want to decompress each of your original images if they come from PNG or JPEG files. You do that by creating another pixel buffer and rendering the original image into the new pixel buffer, so that all the PNG or JPEG decompression is done before you start setting CALayer contents.
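
A minimal sketch of the forced-decompression step described above, assuming ARC: the image is rendered into a fresh bitmap context so the PNG/JPEG decode happens up front, before CALayer contents are set. The DecompressedImage helper is illustrative and not part of AVAnimator:

#import <UIKit/UIKit.h>

static UIImage *DecompressedImage(UIImage *source) {
    CGImageRef cgImage = source.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // 32-bit premultiplied BGRA is a fast format for the iPhone's compositor.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedFirst |
                                                 kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colorSpace);
    if (!context) {
        return source;
    }

    // Drawing forces the PNG/JPEG decode to happen now, into our own buffer.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGImageRef decompressed = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *result = [UIImage imageWithCGImage:decompressed];
    CGImageRelease(decompressed);
    return result;
}

// Usage, before handing the pixels to a layer:
//   tileLayer.contents = (__bridge id)DecompressedImage(sourceImage).CGImage;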
