Fastest way to draw a screen buffer on the iPhone

Posted 2024-08-23 22:47:37

I have a software renderer that I am porting from PC to the iPhone. What is the fastest way to manually update the screen with a buffer of pixels on the iPhone? For instance, on Windows the fastest function I have found is SetDIBitsToDevice.

I don't know much about the iPhone or its libraries, and there seem to be so many layers and different types of UI elements, so I might need a lot of explanation...

For now I'm just going to constantly update a texture in OpenGL and render that to the screen; I very much doubt that this is going to be the best way to do it.

UPDATE:

I have tried the OpenGL screen-sized texture method and got 17fps.

I used a 512x512 texture (because it needs to be a power of two), and just the call

glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512, GL_RGBA, GL_UNSIGNED_BYTE, baseWindowGUI->GetBuffer());

seemed to be responsible for practically ALL of the slowdown.

Commenting it out, while leaving in all of my software-rendering GUI code and the rendering of the now non-updating texture, resulted in 60fps, 30% renderer utilization, and no notable CPU spikes.

Note that GetBuffer() simply returns a pointer to the software back buffer of the GUI system; the buffer is not rearranged or resized in any way, and it is correctly sized and formatted for the texture. So I am fairly certain the slowdown has nothing to do with the software renderer, which is the good news: it looks like if I can find a way to update the screen at 60fps, software rendering should work for the time being.

I tried the texture-update call with 512x320 rather than 512x512, and oddly this was even slower, running at 10fps. It also reports render utilization of only about 5%, with all the time being wasted in a call to Untwiddle32bpp inside OpenGL ES.

I can change my software renderer to natively render to any pixel format if that would result in a more direct blit.

FYI, tested on an iPod touch G2 running 2.2.1 (so roughly an iPhone 3G on steroids).
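
For context, the per-frame path described in this update amounts to roughly the following sketch. This is illustrative only: screenTexture, quadVertices, quadTexCoords and the EAGLContext named context are assumed to have been created during setup, and an OpenGL ES 1.1 fixed-function pipeline is assumed.

// Each frame: upload the software back buffer into the screen-sized texture, then draw a fullscreen quad.
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, screenTexture);            // 512x512 RGBA texture created at startup
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 512, 512,
                GL_RGBA, GL_UNSIGNED_BYTE, baseWindowGUI->GetBuffer());

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, quadVertices);          // 4 corners covering the screen
glTexCoordPointer(2, GL_FLOAT, 0, quadTexCoords);       // maps the 320x480 region of the 512x512 texture
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

[context presentRenderbuffer:GL_RENDERBUFFER_OES];      // EAGL presentation for the ES 1.1 renderbuffer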

UPDATE 2:

I have just finished writing the CoreAnimation/CoreGraphics method. It looks good, but I am a little worried about how it updates the screen each frame, basically ditching the old CGImage and creating a brand new one... check it out in 'someRandomFunction' below. Is this the quickest way to update the image? Any help would be greatly appreciated.

//
//  catestAppDelegate.m
//  catest
//
//  Created by User on 3/14/10.
//  Copyright __MyCompanyName__ 2010. All rights reserved.
//




#import "catestAppDelegate.h"
#import "catestViewController.h"
#import "QuartzCore/QuartzCore.h"

const void* GetBytePointer(void* info)
{
    // this is currently only called once
    return info; // info is a pointer to the buffer
}

void ReleaseBytePointer(void*info, const void* pointer)
{
    // don't care, just using the one static buffer at the moment
}


size_t GetBytesAtPosition(void* info, void* buffer, off_t position, size_t count)
{
    // I don't think this ever gets called
    memcpy(buffer, ((char*)info) + position, count);
    return count;
}

CGDataProviderDirectCallbacks providerCallbacks =
{ 0, GetBytePointer, ReleaseBytePointer, GetBytesAtPosition, 0 };


static CGImageRef cgIm;

static CGDataProviderRef dataProvider;
unsigned char* imageData;
 const size_t imageDataSize = 320 * 480 * 4;
NSTimer *animationTimer;
NSTimeInterval animationInterval= 1.0f/60.0f;


@implementation catestAppDelegate

@synthesize window;
@synthesize viewController;


- (void)applicationDidFinishLaunching:(UIApplication *)application {    


    [window makeKeyAndVisible];


    const size_t byteRowSize = 320 * 4;
    imageData = malloc(imageDataSize);

    for(int i=0;i<imageDataSize/4;i++)
            ((unsigned int*)imageData)[i] = 0xFFFF00FF; // just set it to some random init color, currently yellow


    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    dataProvider =
    CGDataProviderCreateDirect(imageData, imageDataSize,
                               &providerCallbacks);  // currently global

    cgIm = CGImageCreate
    (320, 480,
     8, 32, 320*4, colorSpace,
     kCGImageAlphaNone | kCGBitmapByteOrder32Little,
     dataProvider, 0, false, kCGRenderingIntentDefault);  // also global, probably doesn't need to be

    self.window.layer.contents = cgIm; // set the UIWindow's CALayer's contents to the image, yay works!

   // CGImageRelease(cgIm);  // we should do this at some stage...
   // CGDataProviderRelease(dataProvider);

    animationTimer = [NSTimer scheduledTimerWithTimeInterval:animationInterval target:self selector:@selector(someRandomFunction) userInfo:nil repeats:YES];
    // set up a timer in the attempt to update the image

}
float col = 0;

-(void)someRandomFunction
{
    // update the original buffer
    for(int i=0;i<imageDataSize;i++)
        imageData[i] = (unsigned char)(int)col;

    col+=256.0f/60.0f;

    // and currently the only way I know how to apply that buffer update to the screen is to
    // create a new image and bind it to the layer...???
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    cgIm = CGImageCreate
    (320, 480,
     8, 32, 320*4, colorSpace,
     kCGImageAlphaNone | kCGBitmapByteOrder32Little,
     dataProvider, 0, false, kCGRenderingIntentDefault);

    CGColorSpaceRelease(colorSpace);

    self.window.layer.contents = cgIm;
    CGImageRelease(cgIm); // the layer retains its contents, so release our reference here to avoid leaking a CGImage every frame

    // and that currently works, updating the screen, but I don't know how well it runs...
}


- (void)dealloc {
    [viewController release];
    [window release];
    [super dealloc];
}


@end



5 Answers

溺渁∝ 2024-08-30 22:47:37

The fastest App Store-approved way to do CPU-only 2D graphics is to create a CGImage backed by a buffer using CGDataProviderCreateDirect and assign it to a CALayer's contents property.

For best results, use the kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little or kCGImageAlphaNone | kCGBitmapByteOrder32Little bitmap types, and double-buffer so that the display is never in an inconsistent state.

edit: In theory this should be faster than drawing to an OpenGL texture, but as always, profile to be sure.

edit2: CADisplayLink is a useful class no matter which compositing method you use.
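
A minimal sketch of what the double-buffered, CADisplayLink-driven version of this could look like. The names here (startPresenting, presentFrame, frameBuffers, targetLayer) are placeholders, the buffers are assumed to be 320x480 BGRA, and this assumes manual reference counting as in the question's code.

#import <QuartzCore/QuartzCore.h>

// Two raw BGRA buffers: render into one while the other backs the image currently on screen.
static unsigned char frameBuffers[2][320 * 480 * 4];
static CGDataProviderRef providers[2];
static int backIndex = 0;
static CALayer *targetLayer;   // set this to whichever CALayer you are driving (e.g. the window's layer)

static const void *GetFramePointer(void *info) { return info; }
static void ReleaseFramePointer(void *info, const void *pointer) { /* static buffers, nothing to free */ }

- (void)startPresenting
{
    static CGDataProviderDirectCallbacks cb = { 0, GetFramePointer, ReleaseFramePointer, NULL, NULL };
    providers[0] = CGDataProviderCreateDirect(frameBuffers[0], sizeof(frameBuffers[0]), &cb);
    providers[1] = CGDataProviderCreateDirect(frameBuffers[1], sizeof(frameBuffers[1]), &cb);

    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self selector:@selector(presentFrame)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
}

- (void)presentFrame
{
    // ... software-render the next frame into frameBuffers[backIndex] here ...

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef frame = CGImageCreate(320, 480, 8, 32, 320 * 4, colorSpace,
                                     kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little,
                                     providers[backIndex], NULL, false, kCGRenderingIntentDefault);
    targetLayer.contents = (id)frame;   // the layer retains the image
    CGImageRelease(frame);
    CGColorSpaceRelease(colorSpace);

    backIndex = 1 - backIndex;          // next frame renders into the buffer that is not on screen
}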


弃爱 2024-08-30 22:47:37

The fastest way is to use IOFrameBuffer/IOSurface, but those are private frameworks.

So OpenGL seems to be the only viable way for App Store apps.


祁梦 2024-08-30 22:47:37

Just to post my comment to @rpetrich's answer in the form of an answer: in my tests I found OpenGL to be the fastest way. I've implemented a simple object (a UIView subclass) called EEPixelViewer that does this generically enough that it should work for most people, I think.

It uses OpenGL to push pixels in a wide variety of formats (24bpp RGB, 32-bit RGBA, and several YpCbCr formats) to the screen as efficiently as possible. The solution achieves 60fps for most pixel formats on almost every iOS device, including older ones. Usage is very simple and requires no OpenGL knowledge:

pixelViewer.pixelFormat = kCVPixelFormatType_32RGBA;
pixelViewer.sourceImageSize = CGSizeMake(1024, 768);
EEPixelViewerPlane plane;
plane.width = 1024;
plane.height = 768;
plane.data = pixelBuffer;
plane.rowBytes = plane.width * 4;
[pixelViewer displayPixelBufferPlanes: &plane count: 1 withCompletion:nil];

Repeat the displayPixelBufferPlanes call for each frame (it loads the pixel buffer to the GPU using glTexImage2D), and that's pretty much all there is to it. The code is smart in that it tries to use the GPU for any simple processing required, such as permuting the color channels, converting YpCbCr to RGB, and so on.

There is also quite a bit of logic for honoring scaling via the UIView's contentMode property, so UIViewContentModeScaleToFit/Fill, etc. all work as expected.


﹉夏雨初晴づ 2024-08-30 22:47:37

Perhaps you could abstract the methods used in the software renderer into a GPU shader; you might get better performance. You'd need to send the encoded "video" data to it as a texture.


你对谁都笑 2024-08-30 22:47:37

A faster method than both CGDataProvider and glTexSubImage is to use CVOpenGLESTextureCache. CVOpenGLESTextureCache allows you to modify an OpenGL texture directly in graphics memory without re-uploading it.

I used it for a fast animation view, which you can see here:

https://github.com/justinmeiners/image-sequence-streaming

It is a little tricky to use; I came across it after asking my own question on this topic: How to directly update pixels - with CGImage and direct CGDataProvider
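
A rough sketch of the texture-cache approach, for orientation only: eaglContext is assumed to be an existing EAGLContext, the 320x480 size and the placement of the "setup" and "each frame" sections are placeholders, and error checking is omitted.

#import <CoreVideo/CoreVideo.h>
#import <CoreVideo/CVOpenGLESTextureCache.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// One-time setup: a texture cache tied to your EAGLContext, and an IOSurface-backed pixel buffer to render into.
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL, &textureCache);

CVPixelBufferRef pixelBuffer;
NSDictionary *attrs = [NSDictionary dictionaryWithObject:[NSDictionary dictionary]
                                                  forKey:(NSString *)kCVPixelBufferIOSurfacePropertiesKey];
CVPixelBufferCreate(kCFAllocatorDefault, 320, 480, kCVPixelFormatType_32BGRA,
                    (CFDictionaryRef)attrs, &pixelBuffer);

// Each frame: write pixels straight into the buffer, then wrap it as a GL texture with no re-upload.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
void *pixels = CVPixelBufferGetBaseAddress(pixelBuffer);
// ... software-render the frame into 'pixels' (BGRA, stride = CVPixelBufferGetBytesPerRow(pixelBuffer)) ...
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CVOpenGLESTextureRef texture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
                                             GL_TEXTURE_2D, GL_RGBA, 320, 480,
                                             GL_BGRA_EXT, GL_UNSIGNED_BYTE, 0, &texture);
glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
// ... draw a textured quad with this texture and present the renderbuffer ...

CFRelease(texture);                            // release the per-frame texture wrapper
CVOpenGLESTextureCacheFlush(textureCache, 0);  // let the cache recycle its internal resources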

