Problem reading data at high speed with the SetTimer method
I am using hardware with an FTDI chip in FIFO mode and the D2XX driver for my work. The hardware sends data bytes at rates from 19.5 kHz to 312.5 kHz depending on the settings, so my application software (MFC C++) has to read these bytes arriving at various speeds. I am using SetTimer and OnTimer with a value of 10 ms, so every 10 ms I read the data bytes and do some processing in the OnTimer function. My questions are:
1. With the SetTimer method, different systems give different results even though both run XP SP3. One system reads all the bytes without missing any, but on the other system data is lost. So does this timer depend on the OS or on the system hardware?
2. From what I understand, the minimum value I can set for SetTimer is 10 ms, so I can only read the data every 10 ms. If I do not read fast enough, the driver buffer overflows, and I cannot control that. So can I read faster, at microsecond or nanosecond intervals, with some timer method, or is there another approach?
Please suggest some ideas... Thanks in advance.
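For reference, a minimal sketch of the SetTimer/OnTimer polling pattern described above might look like the following. The class, handle, timer ID and processing helper names are hypothetical, and it assumes the D2XX calls FT_GetQueueStatus() and FT_Read() declared in ftd2xx.h:

```cpp
#include <afxwin.h>     // MFC
#include <vector>
#include "ftd2xx.h"

void CReaderDlg::StartPolling()
{
    SetTimer(1, 10, NULL);               // request a WM_TIMER roughly every 10 ms
}

void CReaderDlg::OnTimer(UINT_PTR nIDEvent)
{
    if (nIDEvent == 1)
    {
        DWORD queued = 0;
        // Ask the driver how many bytes are waiting, then read them all at once.
        if (FT_GetQueueStatus(m_ftHandle, &queued) == FT_OK && queued > 0)
        {
            std::vector<unsigned char> buf(queued);
            DWORD bytesRead = 0;
            if (FT_Read(m_ftHandle, &buf[0], queued, &bytesRead) == FT_OK)
            {
                ProcessBytes(&buf[0], bytesRead);   // hypothetical processing helper
            }
        }
    }
    CDialog::OnTimer(nIDEvent);
}
```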
2 Answers
It will probably be more reliable to keep blocking on Read() in a separate thread.
SetTimer() is meant only for low-res work. Its resolution can actually be scaled back depending on power settings in more recent versions of Windows. If you want high-resolution timers, Timer Queues or Multimedia Timers (specifically, timeSetEvent()) are the way to go, both of which can have a resolution down to 1 ms.
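A minimal sketch of the "blocking Read() in a separate thread" idea, using the D2XX API directly (FT_SetTimeouts() and FT_Read() from ftd2xx.h); the thread function, buffer size and stop flag are illustrative assumptions, not part of the original answer:

```cpp
#include <windows.h>
#include <process.h>
#include "ftd2xx.h"

volatile LONG g_stop = 0;            // set to 1 from the UI thread to stop the reader

unsigned __stdcall ReaderThread(void* param)
{
    FT_HANDLE h = (FT_HANDLE)param;
    FT_SetTimeouts(h, 100, 100);     // 100 ms read timeout so the loop can check g_stop

    unsigned char buf[4096];
    while (!g_stop)
    {
        DWORD bytesRead = 0;
        // Blocks until sizeof(buf) bytes arrive or the read timeout expires.
        if (FT_Read(h, buf, sizeof(buf), &bytesRead) == FT_OK && bytesRead > 0)
        {
            // Hand buf[0..bytesRead) to the processing code, e.g. through a
            // queue guarded by a critical section, or PostMessage() to the UI.
        }
    }
    return 0;
}

// Started from the UI thread once the device handle is open, for example:
//   _beginthreadex(NULL, 0, ReaderThread, m_ftHandle, 0, NULL);
```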
The problem with SetTimer is that you will get timer notifications no sooner than the time you set, but you can't guarantee that it will be at exactly the time you set.
First, you depend on the system's timer resolution, which is usually 15 ms. So, even if you set 10 ms, it will tick at 15 ms (or more).
Second, timer notifications are sent when there are no more messages to process. So, in the message loop, when there are no more messages in the queue, the system looks at whether any timer is set and whether it has ticked. If it has, it sends one notification. The thing is, if your program is busy doing other things, it might miss one or several "ticks" and "group" them into one notification, which could arrive at, say, 75 ms.
The bottom line is SetTimer is not a reliable timer method for high resolution needs.
So you may have to look at other solutions, like a separate thread to do the reading.
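If a timer-driven design is still preferred, the Multimedia Timer mentioned in the first answer could replace SetTimer. A minimal sketch, assuming a 1 ms period and an illustrative g_ftHandle opened elsewhere with FT_Open(); note the callback runs on an OS-owned worker thread, so it must not touch MFC UI objects directly:

```cpp
#include <windows.h>
#include <mmsystem.h>
#include "ftd2xx.h"
#pragma comment(lib, "winmm.lib")

FT_HANDLE g_ftHandle = NULL;          // opened elsewhere with FT_Open()

void CALLBACK ReadTick(UINT uTimerID, UINT uMsg, DWORD_PTR dwUser,
                       DWORD_PTR dw1, DWORD_PTR dw2)
{
    DWORD queued = 0;
    if (FT_GetQueueStatus(g_ftHandle, &queued) == FT_OK && queued > 0)
    {
        unsigned char buf[4096];
        DWORD toRead = (queued < sizeof(buf)) ? queued : (DWORD)sizeof(buf);
        DWORD bytesRead = 0;
        FT_Read(g_ftHandle, buf, toRead, &bytesRead);
        // Queue buf[0..bytesRead) for processing on another thread.
    }
}

// Start: ask for 1 ms system timer resolution, then a periodic 1 ms callback.
//   timeBeginPeriod(1);
//   MMRESULT id = timeSetEvent(1, 1, ReadTick, 0, TIME_PERIODIC | TIME_CALLBACK_FUNCTION);
// Stop:
//   timeKillEvent(id);
//   timeEndPeriod(1);
```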