
QTimer on Windows. What's wrong with it?

  • QTimer seems to be quite imprecise on Windows. I need a delay between 10 ms and 1 s, and depending on the interval, the delay is sometimes even doubled. Why is that?

    For example, on my Win XP machine the test below gives an average delay of:

    • 47 ms when I set the argument to 33 ms
    • 12/13 ms for 13 ms
    • 93 ms for anything between 80 and 93 ms
    • I was tired and decided that these values were alarming enough to stop there :(

    #include <QApplication>
    #include <QDebug>
    #include <QTime>
    #include <QTimer>
    #include <QWidget>

    class WriteTime : public QWidget
    {
        Q_OBJECT

        int shootinterval;
        QTime chrono;
        QTimer timer;
        int count;

    public:
        explicit WriteTime(QWidget* parent, int interval)
            : QWidget(parent)
            , shootinterval(interval)
            , chrono()
            , timer()
            , count(0)
        {
            connect(&timer, SIGNAL(timeout()), this, SLOT(updateStuff()));
        }

        void start()
        {
            chrono.start();
            timer.start(shootinterval);
        }

    private slots:
        void updateStuff()
        {
            ++count;
            if (count == 100) {
                qDebug() << chrono.elapsed() / 100; // average interval over the last 100 timeouts
                count = 0;
                chrono.restart();
            }
        }
    };

    int main(int argc, char *argv[])
    {
        QApplication a(argc, argv);
        WriteTime t(0, /*whatever you want here*/);
        t.start();
        return a.exec();
    }

    #include "main.moc" // needed because the Q_OBJECT class lives in a .cpp file

  • Hi,

    Windows is not a real-time OS. A timer is just an event that is emitted; it will be processed by the event loop when the event loop is idle and no other thread is currently active. You can't rely on getting it exactly on time. Not on Windows, and not on Linux or Mac either.

    So your PC is executing other apps, and your process also does other things (updating the UI, printing debug messages, ...).
    Since you use qDebug() to get the messages, I assume you have the debugger connected, which also has an influence. You can rely on the event being posted at the correct time, but when it is handled is a totally different thing.

  • Gerolf is right. Furthermore, Windows in particular has a limited resolution for its timers anyway. You simply cannot get 1 ms accuracy.

  • 1 ms accuracy is a challenge even on Unix boxes. The simple fact is that for real-life desktop applications you should never actually have to bother with this.

  • Think of QTimer as guaranteeing only that the timeout does not trigger before the set interval :-) (or at the earliest opportunity possible in the case of timeout = 0).

  • Thanks for your replies.
    I don't really need 1 ms; usually my lowest timer resolution would be 15 ms.
    I use qDebug(), but the debugger is disconnected. I run it from Qt Creator so the messages are displayed.

    Well, this test program is far from perfect, I know.
    I understand the notion of preemption, and the program above has only my QTimer::timeout() signal in the event loop most of the time. But the difference in behavior between the Windows and Linux implementations is quite large.

    I have tried something else:

    • I increment the timer interval after some number of iterations.
    • On Linux I get incremental values, sometimes bigger than the timeout, as you guys mentioned.
      If I were to plot the results, it would look like a LINE CHART.
    • On Windows, the results look more like a CLASS HISTOGRAM: for anything in [x, y] I get y.

  • 15 ms is already tricky. Windows systems can't be more precise than 10 to 16 ms at best. Linux desktop systems usually can make 1 ms.

  • I know this is an old topic now, but there is no point in me opening a new one when all the information is already here. I have a further observation and question:

    I get very similar numbers to those from the first post. My timer is set to trigger every 60 ms, but it fires at one of the following intervals (and no others):

    47 ms, 62/63 ms, 78 ms

    The other figure from the original post is 93 ms.

    Each of these is 15/16 ms apart. It seems that Windows has some sort of time slots it is using; does anyone know about that? It can't be just a coincidence that we always hit these numbers.
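    Those numbers are consistent with a coarse system tick: if timeouts can only be noticed on a tick boundary, the observed interval is the requested one rounded up to the next multiple of the tick. A minimal sketch of that model follows; the 15.625 ms tick (64 interrupts per second, a commonly cited Windows default) is an assumption, not something Qt documents, and it does not explain every figure in the thread (e.g. the 12/13 ms reading), so treat it as a rough model only.

    ```cpp
    #include <cassert>
    #include <cmath>
    #include <iostream>

    // Round a requested timer interval up to the next multiple of the
    // system tick. tickMs = 15.625 is an assumed default Windows timer
    // interrupt period (64 Hz), not a documented Qt constant.
    double observedInterval(double requestedMs, double tickMs = 15.625)
    {
        return std::ceil(requestedMs / tickMs) * tickMs;
    }

    int main()
    {
        for (double requested : {33.0, 60.0, 80.0, 93.0}) {
            std::cout << requested << " ms requested -> ~"
                      << observedInterval(requested) << " ms observed\n";
        }
        return 0;
    }
    ```

    Rounded to whole milliseconds, this reproduces the 47, 62/63 and 93 ms figures reported above.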

  • I don't think this is necessarily a Windows issue so much as an issue with Qt's implementation of its timer. Windows is apparently capable of high-resolution timers - more info here
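    For what it's worth, the coarse tick can be raised on Windows through the multimedia timer API (timeBeginPeriod/timeEndPeriod from winmm), which is how applications commonly get ~1 ms timer resolution; newer Qt versions (Qt 5 and later) also expose QTimer::setTimerType(Qt::PreciseTimer) for the same purpose. Below is a hedged sketch of the WinMM approach: the RAII wrapper name is mine, the call is a process-wide setting that increases power consumption, and on non-Windows platforms it compiles to a no-op.

    ```cpp
    #include <iostream>

    #ifdef _WIN32
    #include <windows.h>
    #include <mmsystem.h>   // timeBeginPeriod / timeEndPeriod; link against winmm.lib
    #endif

    // Illustrative RAII wrapper (not a Qt API): request 1 ms system timer
    // resolution while the object is alive. Elsewhere this does nothing,
    // since Linux desktops usually tick at ~1 ms already.
    struct ScopedTimerResolution
    {
        ScopedTimerResolution()
        {
    #ifdef _WIN32
            timeBeginPeriod(1);   // ask the OS for a 1 ms tick
    #endif
        }
        ~ScopedTimerResolution()
        {
    #ifdef _WIN32
            timeEndPeriod(1);     // must balance every timeBeginPeriod call
    #endif
        }
    };

    int main()
    {
        ScopedTimerResolution res;
        std::cout << "timer resolution requested\n";
        // ... run the event loop here; on Windows, timer intervals should
        // now quantize to ~1 ms instead of ~15.6 ms ...
        return 0;
    }
    ```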
