QTimer on Windows. What's wrong with it?
-
QTimer doesn't seem to be very precise on Windows. I need delays between 10 ms and 1 s, and depending on the interval the delay is sometimes even doubled. Why is that?
For example, on my Windows XP machine the test below gives an average delay of:
- 47 ms when I set the interval to 33 ms
- 12/13 ms for 13 ms
- 93 ms for anything between 80 and 93 ms
- I was tired and decided that these values were alarming enough to stop there :(
@
#include <QApplication>
#include <QDebug>
#include <QTime>
#include <QTimer>
#include <QWidget>

class WriteTime : public QWidget
{
    Q_OBJECT

private:
    int shootinterval;
    QTime chrono;
    QTimer timer;
    int count;

public:
    explicit WriteTime(QWidget* parent, int interval)
        : QWidget(parent)
        , shootinterval(interval)
        , chrono()
        , timer()
        , count(0)
    {
        connect(&timer, SIGNAL(timeout()), this, SLOT(updateStuff()));
    }

    void start() { timer.setInterval(shootinterval); timer.start(); chrono.start(); }

private slots:
    void updateStuff()
    {
        count++;
        if (count == 100) {
            qDebug() << chrono.elapsed() / 100; // average delay over the last 100 timeouts, in ms
            chrono.restart();
            count = 0;
        }
    }
};

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    WriteTime t(0, 33); // or whatever interval you want here
    t.start();
    return a.exec();
}
@
-
Hi,
Windows is not a realtime OS. A timer is just an event that gets posted; it is processed by the event loop when the loop is idle and nothing else in your process is currently running. You can't rely on it being delivered exactly on time, not on Windows, and not on Linux or Mac either.
So your PC is executing other applications, and your process also does other stuff (updating the UI, printing debug messages, ...).
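To see this effect in isolation, you can block the event loop inside the slot and watch the deliveries drift. A rough, untested sketch (not the original poster's code; the 20 ms / 50 ms numbers are arbitrary):
@
#include <QCoreApplication>
#include <QDebug>
#include <QTime>
#include <QTimer>

class BusySlot : public QObject
{
    Q_OBJECT
    QTime chrono;
public:
    BusySlot() { chrono.start(); }
public slots:
    void onTimeout()
    {
        qDebug() << "timeout delivered at" << chrono.elapsed() << "ms";
        // Simulate the application being busy: burn ~50 ms in the event loop thread.
        QTime busy;
        busy.start();
        while (busy.elapsed() < 50) {}
    }
};

int main(int argc, char *argv[])
{
    QCoreApplication a(argc, argv);
    BusySlot b;
    QTimer timer;
    QObject::connect(&timer, SIGNAL(timeout()), &b, SLOT(onTimeout()));
    timer.start(20); // asks for 20 ms, but each delivery will lag behind the busy slot
    return a.exec();
}
@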
Since you use qDebug() to print the messages, I assume you have the debugger attached, which also has an influence here. You can rely on the timer event being posted at the correct time, but when it is handled is a totally different thing.
-
Thanks for your replies.
I don't really need 1 ms; usually my lowest timer resolution would be 15 ms.
I use qDebug(), but the debugger is disconnected. I ran the test from Qt Creator so the messages are displayed. This test program is far from perfect, I know that.
I understand the notion of preemption, and the program above has only my QTimer::timeout() signal in the event loop most of the time. But the difference in behavior between the Windows and Linux implementations is quite large. I have tried something else (see the sketch after this list):
- I increment the timer interval after some number of iterations.
- On Linux I get incremental values, sometimes bigger than the timeout, as you guys mentioned.
- If I drew the results as a chart, Linux would look like a LINE CHART, while on Windows the results look more like a CLASS HISTOGRAM: for anything in [x, y] I get y.
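Roughly what the sweep looks like (a sketch, not the exact code I ran): every 100 timeouts I print the measured average against the requested interval and bump the interval by 1 ms.
@
// Sketch of the modified updateStuff() slot of the WriteTime class above:
void updateStuff()
{
    count++;
    if (count == 100) {
        qDebug() << "requested" << shootinterval
                 << "ms, measured average" << chrono.elapsed() / 100 << "ms";
        count = 0;
        ++shootinterval;                  // sweep the interval upwards
        timer.setInterval(shootinterval); // restarts the running timer with the new interval
        chrono.restart();
    }
}
@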
-
I know this is an old topic now, but there is no point in opening a new one when all the information is already here. I have a further observation and question:
I get very similar numbers to those in the first post. My timer is set to trigger every 60 ms, but it fires at one of the following intervals (and no others):
47 ms, 62/63 ms, 78 ms
The other figure from the original post is 93 ms.
Each of these is 15/16 ms apart. It seems that Windows uses some sort of time slots. Does anyone know about that? It can't just be coincidence that we always hit these numbers.
-
I don't think this is necessarily a Windows issue so much as an issue with Qt's implementation of its timers. Windows is apparently capable of high-resolution timers; more info here: http://msdn.microsoft.com/en-us/library/windows/desktop/ms644900(v=vs.85).aspx
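For what it's worth, the 15/16 ms spacing reported above matches the default Windows timer tick of roughly 15.6 ms. If your Qt version falls back on the standard Windows timer, you can try asking the multimedia timer API for a finer tick before starting the event loop. A sketch (Windows only, link against winmm.lib; whether it actually helps depends on how your Qt version implements QTimer):
@
#include <QApplication>
#include <windows.h> // timeBeginPeriod / timeEndPeriod

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);

    timeBeginPeriod(1);      // request a 1 ms system timer resolution
    int ret = a.exec();      // QTimer intervals should now be rounded much less coarsely
    timeEndPeriod(1);        // always undo the request before exiting
    return ret;
}
@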