Windows 7 timing functions - how to use GetSystemTimeAdjustment correctly?

Posted 2024-12-08 10:04:49

I ran some tests using the GetSystemTimeAdjustment function on Windows 7 and got some interesting results which I cannot explain. As far as I understand, this function should report whether the system time is adjusted periodically and, if so, at which interval and with which increment it is updated (see the GetSystemTimeAdjustment function on MSDN).

From this I conclude that if I query the system time repeatedly, for example using GetSystemTimeAsFileTime, I should either see no change (the system clock has not been updated yet), or a change which is a multiple of the increment retrieved by GetSystemTimeAdjustment. Question one: Is this assumption correct?

Now consider the following testing code:

#include <windows.h>
#include <iostream>
#include <iomanip>

int main()
{
    // Reference timestamp taken once at the start.
    FILETIME fileStart;
    GetSystemTimeAsFileTime(&fileStart);
    ULARGE_INTEGER start;
    start.HighPart = fileStart.dwHighDateTime;
    start.LowPart = fileStart.dwLowDateTime;

    for (int i = 20; i > 0; --i)
    {
        FILETIME timeStamp1;
        ULARGE_INTEGER ts1;

        GetSystemTimeAsFileTime(&timeStamp1);

        ts1.HighPart = timeStamp1.dwHighDateTime;
        ts1.LowPart  = timeStamp1.dwLowDateTime;

        // Elapsed time in seconds (FILETIME counts 100 ns units).
        std::cout << "Timestamp: " << std::setprecision(20)
                  << (double)(ts1.QuadPart - start.QuadPart) / 10000000 << std::endl;
    }

    DWORD dwTimeAdjustment = 0, dwTimeIncrement = 0;
    BOOL fAdjustmentDisabled = TRUE;
    GetSystemTimeAdjustment(&dwTimeAdjustment, &dwTimeIncrement, &fAdjustmentDisabled);

    std::cout << "\nTime Adjustment disabled: " << fAdjustmentDisabled
        << "\nTime Adjustment: " << (double)dwTimeAdjustment / 10000000
        << "\nTime Increment: " << (double)dwTimeIncrement / 10000000 << std::endl;
}

It takes 20 timestamps in a loop and prints them to the console. At the end it prints the increment with which the system clock is updated. I would expect the differences between consecutive timestamps printed in the loop to be either 0 or a multiple of this increment. However, I get results like this:

Timestamp: 0
Timestamp: 0.0025000000000000001
Timestamp: 0.0074999999999999997
Timestamp: 0.01
Timestamp: 0.012500000000000001
Timestamp: 0.014999999999999999
Timestamp: 0.017500000000000002
Timestamp: 0.022499999999999999
Timestamp: 0.025000000000000001
Timestamp: 0.0275
Timestamp: 0.029999999999999999
Timestamp: 0.032500000000000001
Timestamp: 0.035000000000000003
Timestamp: 0.040000000000000001
Timestamp: 0.042500000000000003
Timestamp: 0.044999999999999998
Timestamp: 0.050000000000000003
Timestamp: 0.052499999999999998
Timestamp: 0.055
Timestamp: 0.057500000000000002

Time Adjustment disabled: 0
Time Adjustment: 0.0156001
Time Increment: 0.0156001

So it appears that the system time is updated at an interval of about 0.0025 seconds, not the 0.0156 seconds returned by GetSystemTimeAdjustment.
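
To rule out eyeballing errors, the deltas can also be checked programmatically against the reported increment. A minimal sketch of such a check (not part of the original test; it buffers the raw 100 ns timestamps first so console output cannot disturb the loop):

#include <windows.h>
#include <cstdio>

int main()
{
    // Buffer raw 100 ns timestamps; printing inside the loop would
    // add several milliseconds per iteration.
    ULONGLONG ts[21];
    for (int i = 0; i < 21; ++i)
    {
        FILETIME ft;
        GetSystemTimeAsFileTime(&ft);
        ts[i] = ((ULONGLONG)ft.dwHighDateTime << 32) | ft.dwLowDateTime;
    }

    DWORD adj = 0, inc = 0;
    BOOL disabled = FALSE;
    GetSystemTimeAdjustment(&adj, &inc, &disabled);

    for (int i = 1; i < 21; ++i)
    {
        ULONGLONG delta = ts[i] - ts[i - 1];
        // Is the observed delta a whole multiple of the reported increment?
        printf("delta: %llu (multiple of increment: %s)\n",
               delta, (inc && delta % inc == 0) ? "yes" : "no");
    }
}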

Question two: What is the reason for this?

Comments (3)

jJeQQOZ5 2024-12-15 10:04:49

The GetSystemTimeAsFileTime API provides access to the system's wall clock in file time format.

A 64-bit FILETIME structure receives the system time as the number of 100 ns units that have elapsed since January 1, 1601. The call to GetSystemTimeAsFileTime itself typically requires 10 ns to 15 ns.

In order to investigate the real accuracy of the system time provided by this API, the granularity that comes along with the time values needs to be discussed. In other words: How often is the system time updated? A first estimate is provided by the hidden API call:

NTSTATUS NtQueryTimerResolution(OUT PULONG MinimumResolution, 
                                OUT PULONG MaximumResolution, 
                                OUT PULONG ActualResolution);

NtQueryTimerResolution is exported by the native Windows NT library NTDLL.DLL. The ActualResolution reported by this call represents the update period of the system time in 100 ns units, which does not necessarily match the interrupt period. The value depends on the hardware platform. Common hardware platforms report 156,250 or 100,144 for ActualResolution; older platforms may report even larger numbers; newer systems, particularly when HPET (High Precision Event Timer) or a constant/invariant TSC is supported, may return 156,001 for ActualResolution.

This is one of the heartbeats controlling the system. The MinimumResolution and the ActualResolution are relevant for the multimedia timer configuration.
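
Since NtQueryTimerResolution is not declared in the SDK headers, it has to be resolved from NTDLL.DLL at run time. A minimal sketch of such a query (the typedef name is my own; the prototype matches the one above):

#include <windows.h>
#include <cstdio>

// NtQueryTimerResolution is undocumented; all three results are in 100 ns units.
typedef LONG (NTAPI *NtQueryTimerResolution_t)(PULONG MinimumResolution,
                                               PULONG MaximumResolution,
                                               PULONG ActualResolution);

int main()
{
    NtQueryTimerResolution_t NtQueryTimerResolution =
        (NtQueryTimerResolution_t)GetProcAddress(
            GetModuleHandleW(L"ntdll.dll"), "NtQueryTimerResolution");
    if (!NtQueryTimerResolution) return 1;

    ULONG minRes = 0, maxRes = 0, actRes = 0;
    NtQueryTimerResolution(&minRes, &maxRes, &actRes);
    printf("min: %.4f ms, max: %.4f ms, actual: %.4f ms\n",
           minRes / 10000.0, maxRes / 10000.0, actRes / 10000.0);
}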

The ActualResolution can be set by using the API call

NTSTATUS NtSetTimerResolution(IN ULONG RequestedResolution,
                              IN BOOLEAN Set,
                              OUT PULONG ActualResolution);

or via the multimedia timer interface

MMRESULT timeBeginPeriod(UINT uPeriod);

with the value of uPeriod derived from the range allowed by

MMRESULT timeGetDevCaps(LPTIMECAPS ptc, UINT cbtc );

which fills the structure

typedef struct {
  UINT wPeriodMin;
  UINT wPeriodMax;
} TIMECAPS;

Typical values are 1 ms for wPeriodMin and 1,000,000 ms for wPeriodMax.
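
A typical usage pattern, as a sketch: query the supported range with timeGetDevCaps, request the minimum period around the timing-sensitive work, and release it afterwards (winmm.lib must be linked):

#include <windows.h>
#include <mmsystem.h>
#include <cstdio>
#pragma comment(lib, "winmm.lib")

int main()
{
    TIMECAPS tc;
    if (timeGetDevCaps(&tc, sizeof(tc)) != TIMERR_NOERROR) return 1;
    printf("wPeriodMin: %u ms, wPeriodMax: %u ms\n", tc.wPeriodMin, tc.wPeriodMax);

    // Request the shortest period the multimedia timer interface allows...
    timeBeginPeriod(tc.wPeriodMin);
    // ...timing-sensitive work goes here; the system time now advances
    // in smaller steps (typically 1 ms)...
    timeEndPeriod(tc.wPeriodMin);  // always pair with timeBeginPeriod
}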

There is an unfortunate misinterpretation when looking at the min/max values here:

  • wPeriodMin defines the minimum period, which is clear in this context.
  • MinimumResolution returned by NtQueryTimerResolution, on the other hand, specifies a resolution. The lowest obtainable resolution (MinimumResolution) is in the range of up to about 20 ms, while the highest obtainable resolution (MaximumResolution) can be 0.5 ms. However, the 0.5 ms resolution is not accessible through a call to timeBeginPeriod.

The multimedia timer interface handles periods and NtQueryTimerResolution() handles resolutions (reciprocal value of period).

Summary: GetSystemTimeAdjustment is not the function to look at. That function only tells whether and how time changes are applied. Depending on the setting of the multimedia timer interface (timeBeginPeriod), the progress of time may happen more often and in smaller portions. Use NtQueryTimerResolution to receive the actual time increment, and be aware that the settings of the multimedia timer API do influence the values. (Example: when the media player is showing a video, the time increments become shorter.)

I have diagnosed Windows time matters to a large extent. Some of the results can be found here.

Note: A Time Adjustment value of 0.0156001 clearly identifies Windows Vista or higher with HPET and/or a constant/invariant TSC on your system.

Implementation: If you want to catch the time transition:

#include <windows.h>
#include <stdio.h>

int main()
{
    FILETIME FileTime, LastFileTime;
    long long DueTime, LastTime;
    long FileTimeTransitionPeriod;

    GetSystemTimeAsFileTime(&FileTime);
    for (int i = 0; i < 20; i++) {
        LastFileTime = FileTime;  // keep the full value for the subtraction below
        // It is enough to watch the low part to catch the transition.
        while (FileTime.dwLowDateTime == LastFileTime.dwLowDateTime)
            GetSystemTimeAsFileTime(&FileTime);
        CopyMemory(&DueTime, &FileTime, sizeof(FILETIME));
        CopyMemory(&LastTime, &LastFileTime, sizeof(FILETIME));
        FileTimeTransitionPeriod = (long)(DueTime - LastTime);
        fprintf(stdout, "transition period: %7.4lf ms\n",
                (double)FileTimeTransitionPeriod / 10000);
    }
    return 0;
}

// WARNING: This code consumes 100% of one CPU core for 20 file time increments.
// At the standard file time increment of 15.625 ms this corresponds to 312.5 ms!

But: When the file time transition is very short (e.g. set by timeBeginPeriod(wPeriodMin)), any output like fprintf or std::cout may distort the result because it delays the loop. In such cases I'd recommend storing the 20 results in a data structure and doing the output afterwards.
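
A minimal sketch of that buffered variant (variable names are mine; only the measured periods are stored inside the loop):

#include <windows.h>
#include <stdio.h>

int main()
{
    double periodsMs[20];   // output is deferred until after the measurement
    FILETIME ft, last;

    GetSystemTimeAsFileTime(&ft);
    for (int i = 0; i < 20; i++) {
        last = ft;
        while (ft.dwLowDateTime == last.dwLowDateTime)
            GetSystemTimeAsFileTime(&ft);   // spin until the file time steps
        ULONGLONG now  = ((ULONGLONG)ft.dwHighDateTime << 32) | ft.dwLowDateTime;
        ULONGLONG prev = ((ULONGLONG)last.dwHighDateTime << 32) | last.dwLowDateTime;
        periodsMs[i] = (double)(now - prev) / 10000.0;  // 100 ns units -> ms
    }

    // Printing happens only here, so it cannot delay the spin loop above.
    for (int i = 0; i < 20; i++)
        printf("transition period: %7.4f ms\n", periodsMs[i]);
}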

And: The file time transition may not always be the same. It may well be that the file time increment does not match the update period. See the link above for more details and examples of this behavior.

Edit: Use caution when calling timeBeginPeriod, as frequent calls can significantly affect the system clock (see MSDN). This behavior applies up to Windows 7.

Calls to timeBeginPeriod/timeEndPeriod or NtSetTimerResolution may change the system time by as much as ActualResolution. Doing this very often results in considerable changes of the system time. However, when the calls are made at or near a transition of the system time, the deviation is much smaller. For demanding applications like NTP clients, polling for a system time transition/increment before calling the functions above is advised. Synchronizing to an NTP server is difficult when unwanted jumps occur in the system time's progress.
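
As a sketch of that advice (my own illustration, not a definitive recipe): spin until the file time steps, then issue the resolution change immediately, so the adjustment lands right after a transition:

#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

// Wait for the next file time transition, then change the timer period.
// Issued right after a transition, the resulting time deviation is minimal.
void SetPeriodAtTransition(UINT periodMs)
{
    FILETIME ft, last;
    GetSystemTimeAsFileTime(&last);
    do {
        GetSystemTimeAsFileTime(&ft);
    } while (ft.dwLowDateTime == last.dwLowDateTime);  // spin until the clock steps
    timeBeginPeriod(periodMs);  // pair with timeEndPeriod(periodMs) when done
}

int main()
{
    SetPeriodAtTransition(1);
    // ...timing-sensitive work...
    timeEndPeriod(1);
}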

潜移默化 2024-12-15 10:04:49

You are actually profiling how long one pass through the for() loop takes. I get somewhat more variability, but 5 milliseconds is about right; console output is not very fast. Arbitrarily add some more std::cout statements to slow it down.

涙—继续流 2024-12-15 10:04:49

GetSystemTimeAsFileTime's resolution is system-dependent. I've seen it claimed that it is between 10 ms and 55 ms. Commentators on the MSDN documentation put it at 15 ms and at "sub-millisecond". What it actually is seems unclear, but I have never seen its resolution claimed to be equal to the 100 ns precision of the timestamp.

This means there's always going to be some variance, and it's also the reason people use QueryPerformanceCounter (together with QueryPerformanceFrequency) instead.
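
For completeness, a minimal sketch of that alternative: QueryPerformanceCounter delivers raw ticks and QueryPerformanceFrequency the tick rate, giving sub-microsecond deltas independent of the system clock granularity:

#include <windows.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);  // ticks per second, fixed at boot
    QueryPerformanceCounter(&t0);

    Sleep(10);  // stand-in for the work being measured

    QueryPerformanceCounter(&t1);
    double elapsedMs = 1000.0 * (double)(t1.QuadPart - t0.QuadPart)
                              / (double)freq.QuadPart;
    printf("elapsed: %.4f ms\n", elapsedMs);
}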
