QBasicTimer time resolution
-
Good day, colleagues!
I'm writing a program that implements an RFC 2544 network test. As part of the test, I must send UDP packets at a specified rate. For example, I should send 64-byte packets at 1 Gb/s, which means sending a UDP packet every 0.5 microseconds. Can I use QBasicTimer to obtain such time resolution, or should I use platform-specific instruments?
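(For reference, the arithmetic behind that figure: 64 bytes = 512 bits, and 512 bits / 10^9 bits/s ≈ 0.512 µs per packet, so roughly 0.5 µs. This ignores the Ethernet preamble and inter-frame gap.)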
-
I don't know QBasicTimer well, but it's meant for internal use, and as far as I can tell it can't run at 0.5 microsecond intervals.
Consider QTimer, but remember that 'the accuracy of timers depends on the underlying operating system and hardware. Most platforms support a resolution of 1 millisecond, though the accuracy of the timer will not equal this resolution in many real-world situations.'
-
I thought that since QBasicTimer is more lightweight than QTimer, I should prefer it to obtain more accurate results.
Also, the "Qt Timer tutorial":http://doc.qt.nokia.com/4.7/timers.html says that using QBasicTimer is an "easy optimization" over QTimer.
In any case, it seems that I can't obtain better than 1 ms resolution using either QTimer or QBasicTimer. Am I right?
-
The resolution of the timers is rather platform dependent. On Windows it is tough to get more accurate than 10 ms. If you really depend on a 0.5 us time interval (2 MHz, that's serious business), you will need hardware support. The IP layer probably cannot even support such accurate timings. You should rethink the need to send out a packet every 0.5 us; more likely you should worry about sending a bunch of packets every so often, or start a thread that just sends the packets one after another.
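To illustrate the "bunch of packets per tick" idea, here is a minimal sketch. The target address, port, and burst size are made-up example values, and it uses Qt 5-style connect syntax for brevity:

[code]
// Minimal sketch: meet the average rate by sending a burst of packets on
// each millisecond-level timer tick instead of one packet per tick.
// The localhost target, port 5000, and burst size are hypothetical values.
#include <QCoreApplication>
#include <QUdpSocket>
#include <QHostAddress>
#include <QTimer>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QUdpSocket socket;
    const QByteArray payload(64, '\0');          // 64-byte dummy packet
    const QHostAddress target(QHostAddress::LocalHost);
    const quint16 port = 5000;

    // ~2M packets/s: with a 1 ms tick, that is roughly 2000 packets per tick.
    const int packetsPerTick = 2000;

    QTimer timer;
    QObject::connect(&timer, &QTimer::timeout, [&]() {
        for (int i = 0; i < packetsPerTick; ++i)
            socket.writeDatagram(payload, target, port);
    });
    timer.start(1);                              // request 1 ms; best effort

    return app.exec();
}
[/code]

Bursting trades even pacing for an achievable average rate, which is usually the best a millisecond-class timer can realistically deliver.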
-
Hey, this is interesting. I have done some performance calculations in my app, where I log the elapsed time of a function in milliseconds.
My log shows 2 ms to 5 ms on the S60 platform with a 600 GHz processor, with the elapsed time calculated using QElapsedTimer and QTime::currentTime(). So does that mean these values are not exact?
-
600 GHz? Wow :P. The clock frequency of your processor has little to do with it. On Linux, for example, the kernel task scheduler has a configurable tick rate; 1 kHz (T = 1 ms) is the maximum for desktop use. What I'm saying is that timing on desktop platforms, or any platform that only needs to be real-time enough for human use (Linux, Windows, Mac OS, that sort of thing), is not fine grained enough to be trusted with high-precision timing.
There is a lot of scheduling involved that you typically do not notice, but during those two to five milliseconds your system has probably switched between different programs and threads multiple times, sometimes leaving your program or library hanging somewhere between two instructions. If you really require fine-grained timing (where accuracy is measured in microseconds), you are looking at a real-time OS or doing your own scheduling. In those cases the timings can be made deterministic (e.g. an interrupt is guaranteed to be handled within 50 us). The scheduling guarantees are usually defined in time units, not clock cycles. A higher clock speed then means that your program can do more in the time it is given, but if it is interrupted in the middle of an operation because its time slice is up, you're still going to wait a few ms before you get CPU time again.
This is probably a lot of information and some of it may not even be interesting to you. The core point is not to trust the system to time more accurately than a few ms. The accuracy may be good while the precision is not, or the other way around ("Wikipedia on Accuracy and Precision":http://en.wikipedia.org/wiki/Accuracy_and_precision).
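If you want to see that jitter for yourself, here is a minimal sketch that asks QTimer for a 1 ms period and logs what it actually delivers. QElapsedTimer::nsecsElapsed() assumes Qt 4.8 or later, and the lambda connect assumes Qt 5:

[code]
// Minimal sketch: request a 1 ms QTimer and log the real interval between
// timeouts with QElapsedTimer, making the scheduling jitter visible.
#include <QCoreApplication>
#include <QTimer>
#include <QElapsedTimer>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QElapsedTimer clock;
    clock.start();
    qint64 last = 0;

    QTimer timer;
    QObject::connect(&timer, &QTimer::timeout, [&]() {
        const qint64 now = clock.nsecsElapsed();
        qDebug() << "interval:" << (now - last) / 1000 << "us";
        last = now;
    });
    timer.start(1);   // ask for 1 ms; the log shows what you actually get

    return app.exec();
}
[/code]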
[EDIT: fixed link, Volker]
-
Thanks Franzk, this is nice info. BTW, that's 600 MHz.
-
[quote author="usamytch" date="1298386132"]Good day, colleagues!
I'm writing a program that implements an RFC 2544 network test. As part of the test, I must send UDP packets at a specified rate. For example, I should send 64-byte packets at 1 Gb/s, which means sending a UDP packet every 0.5 microseconds. Can I use QBasicTimer to obtain such time resolution, or should I use platform-specific instruments?[/quote]
Go with platform-specific code. Qt doesn't offer much below millisecond resolution, while on any modern desktop (with HPET-backed timers) it is easy to get 10-100 nanosecond resolution.
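For example, on Linux a minimal sketch could pace sends by busy-waiting on clock_gettime(CLOCK_MONOTONIC), which typically resolves to tens of nanoseconds on HPET/TSC-backed hardware. The interval and loop count below are illustrative, and busy-waiting burns a full CPU core:

[code]
// Minimal sketch (Linux): pace sends by spinning on a monotonic clock,
// which typically resolves to tens of nanoseconds on modern hardware.
// Busy-waiting burns a CPU core; this only illustrates the resolution.
#include <time.h>
#include <stdint.h>
#include <stdio.h>

static uint64_t now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}

int main(void)
{
    const uint64_t interval_ns = 512;   // ~64-byte packet time at 1 Gb/s
    uint64_t next = now_ns();

    for (int i = 0; i < 10; ++i) {
        while (now_ns() < next) { }     // spin until the next send slot
        // a real test would send the UDP packet here
        printf("slot %d at %llu ns\n", i, (unsigned long long)now_ns());
        next += interval_ns;
    }
    return 0;
}
[/code]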