How to reduce Thread.sleep() latency in Android
I am generating timing events in a loop in a (non-UI) thread in an Android application, and I need those events to occur at precise intervals in time (precise here meaning varying by no more than +/- 5 millisecs). Any error of +/- 10 millisecs (and certainly +/- 20 millisecs) can be perceived by the user. At the top of this loop I do some other calculations that take a variable amount of time, but at the bottom of the loop, I need the event to occur at a pre-calculated time.
A highly simplified version (without exception handling) of one attempt at my non-UI thread is below:
public final void run() {
    long loopTime = 2500L;
    long eventTime = (System.nanoTime() / 100000L) + loopTime;
    while (true) {
        calcutionsTakingVaryingAmountOfTime(); // takes 200 millisecs or less
        long eventWait = eventTime - (System.nanoTime() / 100000L);
        Thread.sleep(eventWait / 10L);
        listener.onEvent();
        eventTime = eventTime + loopTime;
    }
}
It is the call to listener.onEvent() that needs to be precisely timed.

In the example above, the timing variables loopTime, eventTime, and eventWait measure time in tenths of a millisec. The expressions (System.nanoTime() / 100000L) measuring current time are likewise in tenths of a millisec.
I am absolutely certain that calcutionsTakingVaryingAmountOfTime() always takes less than 200 millisecs, and the call to listener.onEvent() takes just a few millisecs. So as it stands, with loopTime set to 2500L, my events ought to occur every 250 millisecs.
I have instrumented my code (not shown) to print to Log.d() the latency in the Thread.sleep() wake-up time. That is, I calculate

    long latency = (System.nanoTime() / 100000L) - eventTime;

immediately after the return from Thread.sleep(), and print it to Log.d().
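A minimal, plain-JVM sketch of this kind of instrumentation (using System.out in place of Log.d(), which is Android-only; the class and method names here are made up for illustration):

```java
// Sketch: measure how far past its deadline Thread.sleep() actually wakes up.
public class SleepLatencyProbe {
    // Returns the wake-up overshoot in whole milliseconds (>= 0 in practice).
    public static long measureOvershootMillis(long sleepMillis) throws InterruptedException {
        long deadline = System.nanoTime() + sleepMillis * 1_000_000L;
        Thread.sleep(sleepMillis);
        return (System.nanoTime() - deadline) / 1_000_000L;
    }

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 5; i++) {
            long overshoot = measureOvershootMillis(50L);
            System.out.println("sleep(50) overshoot: " + overshoot + " ms");
        }
    }
}
```

On a loaded emulator the printed overshoot can wander well into the tens of milliseconds, which is the jitter described below.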
When I run this in the emulator, what I find is that latency (after dividing by 10 to get the result into millisecs) normally jumps between 1 and 50 millisecs in successive passes through the loop, with occasional values as high as half a second. When run on an actual device, things are quite a bit better, but still a little wobbly (and even so, the emulator behavior makes me wonder if this is going to happen on users' devices).
To try to steady my event and control the latency, I tried several other approaches:
- I replaced the Thread.sleep(eventWait / 10L) call with a call to this.wait(eventWait / 10L) (a completely inappropriate use of wait(), I know).
- I manipulated the thread priority prior to entering the loop, calling Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO), as is done all over the Android libraries.

But there was no improvement at all in the latency with either of these.
The one approach that steadies the event, reduces the latency to less than 2 or 3 millisecs, and rarely hiccups is to replace the Thread.sleep() call by a polling loop:

    while ((System.nanoTime() / 100000L) < eventTime)
        ;
On the one hand, I feel embarrassed spending machine cycles like a drunken sailor on liberty. On the other hand, I am beginning to think there is no better way, and I should burn the machine cycles in the polling loop to reduce my latency, and meet my specification. Of course, when my app goes to the background, I pause my thread, so this polling loop works. But what a waste.
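One middle ground, sketched below rather than taken from the question, is a hybrid wait: sleep coarsely for most of the interval, then spin only for the last few milliseconds before the deadline. The 5 ms spin margin is an assumption to tune; Thread.onSpinWait() (Java 9+) is an optional hint.

```java
// Sketch: hybrid wait. Coarse Thread.sleep() covers most of the interval;
// a short busy-wait covers only the final few milliseconds.
public final class HybridWait {
    // How close (in ns) to the deadline we switch from sleeping to spinning.
    private static final long SPIN_THRESHOLD_NANOS = 5_000_000L; // 5 ms, tune as needed

    public static void waitUntil(long deadlineNanos) throws InterruptedException {
        long remaining;
        while ((remaining = deadlineNanos - System.nanoTime()) > SPIN_THRESHOLD_NANOS) {
            // Sleep most of the remaining time, leaving the spin margin intact.
            Thread.sleep((remaining - SPIN_THRESHOLD_NANOS) / 1_000_000L);
        }
        // Burn cycles only for the last few milliseconds.
        while (System.nanoTime() < deadlineNanos) {
            Thread.onSpinWait(); // runtime hint; harmless if ignored
        }
    }

    public static void main(String[] args) throws InterruptedException {
        long deadline = System.nanoTime() + 250_000_000L; // 250 ms from now
        waitUntil(deadline);
        long lateNanos = System.nanoTime() - deadline;
        System.out.println("woke " + (lateNanos / 1_000_000.0) + " ms after deadline");
    }
}
```

This keeps the latency profile of the polling loop while burning cycles for only a few milliseconds per event instead of the whole interval.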
Any ideas would be greatly appreciated.
Comments (1)
I'm using delayed messages with Handler for a similar purpose. It can be a little overkill. In your case I would take a look at the Timer class. This gives me latency of +/- 2 millisecs on the emulator.
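As a plain-Java sketch of the Timer idea (Handler is Android-only; the 250 ms period matches the question's spec, and the class name is made up for illustration):

```java
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.CountDownLatch;

// Sketch: fixed-rate scheduling with java.util.Timer. scheduleAtFixedRate()
// keeps successive executions on a fixed time grid, so a late firing does
// not push back the schedule of later ones.
public class FixedRateEvents {
    public static void main(String[] args) throws InterruptedException {
        final CountDownLatch done = new CountDownLatch(4);
        Timer timer = new Timer("event-timer", /* isDaemon = */ true);
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override public void run() {
                // Stand-in for the question's listener.onEvent().
                System.out.println("event at " + (System.nanoTime() / 1_000_000L) + " ms");
                done.countDown();
            }
        }, 250L, 250L); // first event after 250 ms, then every 250 ms
        done.await();   // let four events fire, then shut down
        timer.cancel();
    }
}
```

scheduleAtFixedRate() (as opposed to schedule()) is the variant to use here, since it measures each delay against the original start time rather than the previous task's actual execution time.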