iOS Games and Run-Loop Management
First, my question: How do you manage your iOS Run-Loop?
Next, my reason: I've been researching this with a variety of prototypes (very early-stage development) and have found a number of perplexing issues.
- First, input issues and the run loop led me to try the following:
- When using the most recommended system (CADisplayLink), I noted that certain touch inputs are dropped once CPU load causes the buffer flip (presentRenderbuffer) to have to wait a frame. This occurs only on the device and not in the simulator (annoyingly, this seems to be related to the wait for vsync blocking the main thread and the way the app run loop processes touch input and eats messages).
- When using the next most recommended system (NSTimer), I noted that certain touch inputs are dropped once CPU load reaches a certain point in the simulator, but not on the device (also annoying). NSTimer also results in much lower precision on when my updates fire.
- When using the least recommended system (running the run loop on its own thread, managed internally with a high-precision timer built from mach_absolute_time), all my touch-input problems go away; however, my ASSERT code now traps in the wrong thread, and only if I usleep following the software interrupt. (My assert code is similar to http://iphone.m20.nl/wp/?p=1.) I really like having my assert code trap immediately at the line that caused the problem, so this solution is not really workable for me: it's harder to debug.
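For reference, the third approach above has roughly this shape. This is only a minimal sketch of my setup, not a drop-in implementation: the names gQuitRequested, GameUpdate, and GameRender are placeholders for your own engine hooks, and the loop is assumed to run on a dedicated pthread/NSThread rather than the main UIKit thread.

```objc
#include <mach/mach_time.h>
#include <unistd.h>

// Convert mach_absolute_time() ticks to nanoseconds via the cached timebase.
static uint64_t MachTicksToNanos(uint64_t ticks)
{
    static mach_timebase_info_data_t sTimebase;
    if (sTimebase.denom == 0)
        mach_timebase_info(&sTimebase);
    return ticks * sTimebase.numer / sTimebase.denom;
}

// Game loop running on its own thread, independent of the UIKit run loop,
// so touch delivery on the main thread is never blocked by rendering.
static void GameLoopThread(void)
{
    const uint64_t kFrameNanos = 1000000000ull / 60; // target 60 Hz
    uint64_t last = mach_absolute_time();
    while (!gQuitRequested)                  // flag set from the main thread
    {
        uint64_t now = mach_absolute_time();
        uint64_t elapsed = MachTicksToNanos(now - last);
        if (elapsed >= kFrameNanos)
        {
            last = now;
            GameUpdate(elapsed * 1e-9);      // simulation step (seconds)
            GameRender();                    // draw + presentRenderbuffer
        }
        else
        {
            // Sleep off the remainder instead of spinning.
            usleep((useconds_t)((kFrameNanos - elapsed) / 1000));
        }
    }
}
```

The downside, as noted, is that asserts now fire on this thread instead of the one the debugger naturally breaks in.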
- Second, lost time:
- While investigating the system, I found that regardless of framerate (bizarre, but I suppose it still makes sense statistically with vsync), I'm waiting approximately 22% of the time on vsync. I've confirmed this by moving glFlush/glFinish around and by playing with how often I make the presentRenderbuffer call. This is key time that I'd love to spend processing AI, etc., rather than simply stalling on a blocking GL call. The only way I can think of around this would involve moving rendering onto its own thread, but I'm not sure re-architecting for multi-threading is warranted on a single-processor device.
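The measurement itself is just timing around the present call. A sketch, assuming the context and renderbuffer names from the stock OpenGL ES template and a gVsyncWaitNanos accumulator of my own invention:

```objc
#include <mach/mach_time.h>

// Time only the present call; accumulated over a second, this is the
// fraction of frame time spent blocked waiting on vsync.
uint64_t t0 = mach_absolute_time();
[context presentRenderbuffer:GL_RENDERBUFFER_OES];   // blocks until vsync
uint64_t t1 = mach_absolute_time();

mach_timebase_info_data_t tb;
mach_timebase_info(&tb);
gVsyncWaitNanos += (t1 - t0) * tb.numer / tb.denom;  // nanoseconds stalled
```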
So has anyone found a magic bullet around these issues? Does anyone have a killer run-loop architecture that's kick-ass on this platform? At the moment it looks like I have to pick the lesser of the evils.
1 Answer
For my own iOS projects, I use the classic approach: create a window .nib, create a class inheriting EAGLView, and add the EAGLView to a view in a view controller, which is placed in its own .nib. At work, I took a slightly different approach inspired by SDL, which you can inspect in our open-sourced library, APRIL. APRIL's main goal is to support as many platforms as possible while retaining simplicity (window and input management only), being clear about licensing issues, and remaining free to use. Our developers want to write apps on one platform (Windows, Mac, or Linux, according to tastes and desires), and then the code is handed over to me to adapt for other platforms.
In the approach we use in APRIL, you don't create any .nibs; upon calling UIApplicationMain, you specify the delegate class as its fourth argument. The main code of the game remains exactly the same on each platform, and only platform-specific stuff is #ifdef'd into the code or abstracted into a helper library. In the app delegate you create the view controller and the window:
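(The code listing from the original answer is missing here. A minimal sketch of what such a delegate looks like, assuming a hypothetical GameViewController; this is illustrative, not APRIL's actual source:)

```objc
- (void)applicationDidFinishLaunching:(UIApplication *)application
{
    window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    viewController = [[GameViewController alloc] init];  // hosts the GL view

    // Keep showing the launch image so the user doesn't see a blank
    // screen between Default.png disappearing and the first GL frame.
    UIImageView *imageView = [[UIImageView alloc]
        initWithImage:[UIImage imageNamed:@"Default.png"]];
    [viewController.view addSubview:imageView];
    [imageView release];

    [window addSubview:viewController.view];
    [window makeKeyAndVisible];

    // Hand control over to the game's own main loop shortly after launch.
    [self performSelector:@selector(runMain:) withObject:nil afterDelay:0.2];
}
```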
Notice how we delay launching by 0.2 seconds? That's why I mentioned the image view above. During those 0.2 seconds we'd otherwise have a blank screen displayed immediately after Default.png, and the extra delay is introduced before control is transferred to runMain:, which releases control to the main app:
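(The runMain: listing is also missing from this copy. Sketched, it simply calls into the game's cross-platform entry point, which then spins its own loop and never returns to UIKit until the game quits; april_main, gArgc, and gArgv here are placeholder names, not confirmed APRIL identifiers:)

```objc
- (void)runMain:(id)sender
{
    // Does not return until the game exits: from this point on, the
    // game's own main loop runs instead of UIApplication's run loop.
    april_main(gArgc, gArgv);
}
```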
So now control is never transferred back to UIApplication's actual main loop; you then create your own main loop.
(On a side note, the view controller is used, of course, to simplify rotating the UI to match the device orientation.)
Both of these approaches use CADisplayLink if supported by the OS. I have not noticed any issues with either method, although my private projects are primarily accelerometer-based. I suspect the APRIL approach might make some of your problems go away, too.