Worst-case operating system clock precision?
I am not sure if this question belongs on StackOverflow but here it is.
I need to generate a timestamp using C# for some data which is to be transferred from one party to another, and I need to know the worst-case precision of the system clock across operating systems (Windows, Linux and Unix). What I need is to figure out the precision such that all operating systems are able to validate this timestamp.
As an example, the clock resolution of the Windows Vista operating system is approximately 10-15 milliseconds.
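For illustration, here is a minimal sketch of the kind of timestamp I have in mind; the whole-second granularity and the CreateTimestamp name are just placeholders, not a requirement:

```csharp
using System;

class TimestampExample
{
    // Illustrative sketch: emit a Unix timestamp in whole seconds, assuming
    // one-second granularity is coarse enough for every target OS
    // (Windows, Linux, Unix) to validate.
    static long CreateTimestamp()
    {
        DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        return (long)(DateTime.UtcNow - epoch).TotalSeconds;
    }

    static void Main()
    {
        Console.WriteLine(CreateTimestamp());
    }
}
```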
2 Answers
Are you looking to generate something like a Unix timestamp for the data? Or to find a timestamp that won't collide with an existing file? If it's the latter, you could always use ticks.
The problem with any "long" timestamp is that it is relative to the machine generating it and won't guarantee non-collision on another system, as the clocks can be set differently (not just drift, but actually be set differently).
If the data is secure/sensitive and you are looking at a time-based mechanism for syncing keys (à la Kerberos), I would not suggest rolling your own, as there are many obstacles to overcome, especially in syncing systems and keeping them in sync.
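If ticks are what you're after, a minimal sketch looks like this; nothing here beyond the standard DateTime API, and the class name is just illustrative:

```csharp
using System;

class TicksExample
{
    static void Main()
    {
        // DateTime ticks are 100-nanosecond intervals since 0001-01-01.
        // Note that the underlying clock does not actually update every
        // 100 ns, so consecutive reads may return the same value.
        long ticks = DateTime.UtcNow.Ticks;
        Console.WriteLine(ticks);
    }
}
```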
Interesting. The major operating systems have, at worst, centisecond resolution (0.01 seconds), though timestamps are often recorded with greater apparent precision.
Linux offers up to microsecond resolution in its timestamps (see man utime), depending on the computer's clock hardware. Windows NT/Win2K/XP/etc. offer millisecond precision in file timestamps (on NTFS only), though the system expresses all timestamps in 0.000 000 1 second units (ten million per second).
If accurate and precise time resolution is needed between systems, GPS receivers easily achieve 100-nanosecond precision as a side effect of how they work, and many inexpensive models do as well as 10 ns. Special GPS models make the derived time available for external use.
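If you want to see the effective clock resolution on a particular machine (as opposed to the 100 ns units the values are expressed in), a rough probe like the one below works; the loop structure and iteration count are just illustrative choices:

```csharp
using System;

class ClockResolutionProbe
{
    static void Main()
    {
        // Spin until DateTime.UtcNow changes, several times, and report the
        // smallest observed step. That step is the clock's effective update
        // interval, even though Ticks are expressed in 100 ns units.
        long smallestStep = long.MaxValue;
        for (int i = 0; i < 10; i++)
        {
            long start = DateTime.UtcNow.Ticks;
            long next = start;
            while (next == start)
                next = DateTime.UtcNow.Ticks;
            smallestStep = Math.Min(smallestStep, next - start);
        }
        Console.WriteLine("Smallest observed step: {0} ticks ({1} ms)",
                          smallestStep, smallestStep / 10000.0);
    }
}
```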