QTime and QTimer resolution

General and Desktop
Tags: qtimer, qtime
17 Posts, 5 Posters, 14.6k Views
alex_malyu (#4)

    Try calling QCoreApplication::processEvents() after you start the timer.

sirop (#5)

@alex_malyu said:

Try calling QCoreApplication::processEvents() after you start the timer.

What should I expect from QCoreApplication::processEvents()?
Shall I do something like
QCoreApplication::processEvents(QEventLoop::ExcludeUserInputEvents)?

      I am going to exclude as much user input as possible
      during plotting anyway.

alex_malyu (#6)

        @sirop said:

What should I expect from QCoreApplication::processEvents()?

QTimer (at least in Qt 4) requires an event loop to be running.
If you create multiple timers in a loop, they may not be started until your function returns.
QCoreApplication::processEvents() may actually force the event loop to start, and that means you will see a different starting time.
The above advice depends on some factors in the code you did not show, and on some assumptions I could not be sure were right.

So I suggested that you try it. Have you tried? Has it changed the picture?
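
To illustrate the suggestion (a minimal sketch, not code from this thread; startPlotTimer and the 1 ms interval are only placeholders), the idea is roughly:

    #include <QCoreApplication>
    #include <QEventLoop>
    #include <QTimer>

    // Hypothetical helper: start the plotting timer, then drain the event queue once.
    void startPlotTimer(QTimer *timer)
    {
        timer->setTimerType(Qt::PreciseTimer);
        timer->setInterval(1);   // 1 ms, as discussed in this thread
        timer->start();

        // Processes the events that are currently queued, once; it does not start
        // an event loop. ExcludeUserInputEvents skips mouse/keyboard events.
        QCoreApplication::processEvents(QEventLoop::ExcludeUserInputEvents);
    }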

sirop (#7)

          @alex_malyu

          I beg your pardon if my words appeared to be rude.

I just meant that I disabled as many subwidgets (pushbuttons, interactive plot features) as possible
for the duration of plotting, hoping that it would take load off the main event loop.
Would it?

As I see it, your suggestion goes in the same direction if used with the
QEventLoop::ExcludeUserInputEvents flag.
          I have not tried your suggestion out yet.

          I have only one QTimer object but I want it to be as precise as possible.

JKSH, Moderators (#8)

            @alex_malyu said:

QCoreApplication::processEvents() may actually force the event loop to start

            No, it will not.

            When you call processEvents() once, it will process the event queue once. That's all. It will not start or stop the event loop.

            @sirop said:

I now have to use a Windows XP Embedded system that is at least 10 years old, with 512 MB RAM.

            Which service pack?

            I have only one QTimer object but I want it to be as precise as possible.

            "As precise as possible" is limited by your hardware and OS. I don't know what your board and Windows XP Embedded are capable of, but there's a chance it can't handle 1000 Hz precisely (especially not while it's painting a graph)

            To test this, try starting your timer with 500 ms intervals. Does your output look correct? After that, gradually reduce the intervals (100 ms, 50 ms, 16 ms, 10 ms, 5 ms, 1 ms). What does your output look like with each interval?

I'll get in the worst case no more than 200 new values per second,
but these values are delivered into a shared data structure by an asynchronous function,
which I monitor synchronously with my QTimer (1 ms interval) in order not to miss any new data.

            If your data updates no faster than 200 Hz, then checking at 400 Hz already guarantees that you'll never miss any data. 1000 Hz is overkill.

    QObject::connect(timer, &QTimer::timeout, [&]() {
        int tempSize=DDEComm::instance()->ddevector->size();
        FILE* fp=fopen(QString(qApp->applicationDirPath()+"\\DEBUG_AAPLOT.txt").toLocal8Bit().data(), "a");

            Note that opening, writing, and closing a file takes several milliseconds.
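
If the goal is to see the timer jitter without the file I/O distorting it, one option (a sketch only, not code from this thread; the buffer size, file name and onTimeout() are made up) is to collect the samples in memory and write them out in one go:

    #include <QElapsedTimer>
    #include <QFile>
    #include <QTextStream>
    #include <QVector>

    // Sketch: record elapsed times in RAM inside the timeout handler,
    // then dump them to disk once enough samples have been collected.
    static QElapsedTimer stopwatch;
    static QVector<qint64> samples;

    void onTimeout()                       // hypothetical handler connected to QTimer::timeout
    {
        if (!stopwatch.isValid()) {        // first call: just arm the stopwatch
            stopwatch.start();
            return;
        }
        samples.append(stopwatch.nsecsElapsed());
        stopwatch.restart();

        if (samples.size() == 10000) {     // flush once, instead of fopen/fclose every millisecond
            QFile f("DEBUG_AAPLOT.txt");
            if (f.open(QIODevice::WriteOnly | QIODevice::Append)) {
                QTextStream out(&f);
                for (qint64 ns : samples)
                    out << ns << '\n';
            }
            samples.clear();
        }
    }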

sirop (#9)

@JKSH said:

Which service pack?

It seems to be SP3.

@JKSH said:

To test this, try starting your timer with 500 ms intervals. Does your output look correct? After that, gradually reduce the intervals (100 ms, 50 ms, 16 ms, 10 ms, 5 ms, 1 ms). What does your output look like with each interval?

My output -- if you mean my almost "real-time" plot -- works with the QTimer interval set anywhere from 1 ms to 5 ms, and a naive user would notice no delay, although the timer does not fire precisely at the 1 ms interval, maybe also because I write too much debug output to the hard disk, as you pointed out.

@JKSH said:

If your data updates no faster than 200 Hz, then checking at 400 Hz already guarantees that you'll never miss any data. 1000 Hz is overkill.

I do see that this is overkill, but as my data source is asynchronous, I decided to put up with it because my application sometimes has to react to the data change in a real-time manner.

I already had a more elegant asynchronous SIGNAL/SLOT solution:
http://forum.qt.io/topic/52824/solved-monitoring-an-object-changed-asynchronously-by-a-callback-function/7

but it was only fast enough for a simulated data change at 1 Hz.

sirop (earlier post):

                @sierdzio said:

                You are aware that updating the plot at 1ms intervals means 1000 Hz refresh rate, while your display is capable of 60Hz as a maximum? Updating so often seems wasteful.

Yes, I am aware of this. I'll get in the worst case no more than 200 new values per second,
but these values are delivered into a shared data structure by an asynchronous function,
which I monitor synchronously with my QTimer (1 ms interval) in order not to miss any new data.
All this has not been a problem so far, and in the worst case I can downsample the new data
within my updatePlot() SLOT, as updatePlot() first does some simple evaluation and only then plots.

                Also, please be aware of this note in the documentation of QTime::elapsed():

                "Note that the accuracy depends on the accuracy of the underlying operating system; not all systems provide 1-millisecond accuracy."
                

I now have to use a Windows XP Embedded system that is at least 10 years old, with 512 MB RAM.
I compile on Win 7 with mingw-4.9.2 32-bit DWARF-2.

                Maybe try with QElapsedTimer?

                Can you try using a lambda function instead of updatePlot() method? I am curious as to what the result would be there (there should be no difference), and also try connecting to updatePlot() using Qt::QueuedConnection (here it should work worse).

Yes, I tried QElapsedTimer with and without a lambda function. You are right: there is no difference. I did not try out Qt::QueuedConnection as I was too lazy for this, but if you are very curious, then I will do even that ;).
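
For reference, the queued variant sierdzio mentioned would look roughly like this (a sketch; plotWidget and PlotWidget::updatePlot are placeholder names, the thread's own code uses a lambda):

    // Lambda connection (what the code further below uses):
    QObject::connect(timer, &QTimer::timeout, [&]() { /* updatePlot logic */ });

    // Queued connection to a member slot: the timeout is delivered as a posted event
    // and only handled when the event loop gets back to it, so extra latency is expected.
    QObject::connect(timer, &QTimer::timeout,
                     plotWidget, &PlotWidget::updatePlot,   // placeholder receiver/slot
                     Qt::QueuedConnection);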

The pattern of the debug output in both cases -- with or without the lambda function -- looks like this:

                TIME ELAPSED(QTimer)= 0, TIME ELAPSED(QElapsedTimer)= 0, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 0, TIME ELAPSED(QElapsedTimer)= 1, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 0, TIME ELAPSED(QElapsedTimer)= 2, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 0, TIME ELAPSED(QElapsedTimer)= 3, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 0, TIME ELAPSED(QElapsedTimer)= 4, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 0, TIME ELAPSED(QElapsedTimer)= 5, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 10, TIME ELAPSED(QElapsedTimer)= 6, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 10, TIME ELAPSED(QElapsedTimer)= 7, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 10, TIME ELAPSED(QElapsedTimer)= 8, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 10, TIME ELAPSED(QElapsedTimer)= 9, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 10, TIME ELAPSED(QElapsedTimer)= 10, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                TIME ELAPSED(QTimer)= 10, TIME ELAPSED(QElapsedTimer)= 11, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=260374400
                

So if I look only at the TIME ELAPSED(QElapsedTimer) values, I might think that
the time resolution for QTimer is very accurate. But if I look at the TIME ELAPSED(QTimer) values,
then I am no longer sure that QTimer is very accurate, although I used Qt::PreciseTimer.
If I look at TIMESPEC Nanosec, I get even more confused...

Would you still advise using a lambda function for some other reason, although it makes no difference here?

What I think might be happening is that your frequent calls to updatePlot() get queued up, and then fired off in quick succession, which could explain "time elapsed = 0". It's still a bit weird, though.

Yes, this is also my suspicion.

My code with QElapsedTimer looks like this:

    timer = new MyTimer();
    timer->setTimerType(Qt::PreciseTimer);
    time = new QTime();
    timeElapsed = new QElapsedTimer();

    QObject::connect(timer, &QTimer::timeout, [&]() {
        int tempSize = DDEComm::instance()->ddevector->size();
        FILE* fp = fopen(QString(qApp->applicationDirPath() + "\\DEBUG_AAPLOT.txt").toLocal8Bit().data(), "a");
        // CLOCK_PROCESS_CPUTIME_ID measures CPU time consumed by this process, not wall-clock time
        struct timespec call_data;
        clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &call_data);

        int te = time->elapsed();
        int tete = timeElapsed->elapsed();
        fprintf(fp, "TIME ELAPSED(QTimer)= %d, TIME ELAPSED(QElapsedTimer)= %d, TIMESPEC Sec=%s, TIMESPEC Nanosec=%ld\n",
                te, tete, ctime(&call_data.tv_sec), call_data.tv_nsec);
        fclose(fp);
        // Some other stuff
    });
                

                Thanks for your answer anyway.

belab (#10)

Hi,
have you tried timeElapsed->nsecsElapsed()? I guess that if the interval is below 1 ms, timeElapsed->elapsed() will always return 0. On a Windows 7 machine I'm getting most of the intervals below 1 ms. Here's some debug output:
                Time since last update 987192 nanoseconds
                Time since last update 980478 nanoseconds
                Time since last update 984674 nanoseconds
                Time since last update 987751 nanoseconds
                Time since last update 981038 nanoseconds
                Time since last update 1013207 nanoseconds
                Time since last update 955861 nanoseconds
                Time since last update 983835 nanoseconds
                Time since last update 1007613 nanoseconds
                Time since last update 955581 nanoseconds
                It seems like "Precise timers will also never time out earlier than expected." is wrong in this case.
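
A minimal way to produce output like the above (a sketch; sinceLastTimeout and onTimeout are placeholder names) is to restart a QElapsedTimer in the timeout handler and log nsecsElapsed() instead of elapsed():

    #include <QDebug>
    #include <QElapsedTimer>

    static QElapsedTimer sinceLastTimeout;   // started once before the QTimer is started

    void onTimeout()                         // hypothetical timeout handler
    {
        // elapsed() returns whole milliseconds and so reads 0 for sub-millisecond gaps;
        // nsecsElapsed() keeps the nanosecond-level detail shown in the output above.
        qDebug() << "Time since last update" << sinceLastTimeout.nsecsElapsed() << "nanoseconds";
        sinceLastTimeout.restart();
    }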

JKSH, Moderators (#11)

                  @sirop said:

                  My output -- if you mean my almost "real-time" plot...

                  I meant your debug output. What do the time measurements look like when you use different timer intervals?

sirop (#12)

                    @JKSH said:

                    I meant your debug output. What do the time measurements look like when you use different timer intervals?

I cannot run the risk of using a 500 ms interval when
working with the real data source, but I tried 4 ms, which is four times the original 1 ms.
The pattern of the debug output looks like this:

                    TIME ELAPSED(QTimer)= 280, TIME ELAPSED(QElapsedTimer)= 276, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 280, TIME ELAPSED(QElapsedTimer)= 280, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 290, TIME ELAPSED(QElapsedTimer)= 284, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 290, TIME ELAPSED(QElapsedTimer)= 288, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 300, TIME ELAPSED(QElapsedTimer)= 292, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 300, TIME ELAPSED(QElapsedTimer)= 296, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 300, TIME ELAPSED(QElapsedTimer)= 300, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 310, TIME ELAPSED(QElapsedTimer)= 304, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 310, TIME ELAPSED(QElapsedTimer)= 308, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    TIME ELAPSED(QTimer)= 320, TIME ELAPSED(QElapsedTimer)= 312, TIMESPEC Sec=Thu Jan 01 01:00:00 1970, TIMESPEC Nanosec=200288000
                    

sirop (#13)

                      @belab said:

                      It seems like "Precise timers will also never time out earlier than expected." is wrong in this case.

                      "On platforms that do not provide nanosecond resolution, the value returned will be the best estimate available." http://doc.qt.io/qt-5/qelapsedtimer.html#nsecsElapsed

I just checked the timer resolution on my Win 7 machine at home:

                      C:\Users\User\Downloads\ClockRes>Clockres.exe
                      
                      ClockRes v2.0 - View the system clock resolution
                      Copyright (C) 2009 Mark Russinovich
                      SysInternals - www.sysinternals.com
                      
                      Maximum timer interval: 15.600 ms
                      Minimum timer interval: 0.500 ms
                      Current timer interval: 15.600 ms
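
The multimedia-timer limits can also be queried programmatically through the winmm API (a sketch, not from the thread; note that timeGetDevCaps reports the multimedia-timer capabilities, which need not match the ClockRes figures exactly):

    #include <windows.h>
    #include <mmsystem.h>   // timeGetDevCaps; link against winmm (-lwinmm with MinGW)
    #include <cstdio>

    int main()
    {
        TIMECAPS tc;
        if (timeGetDevCaps(&tc, sizeof(tc)) == TIMERR_NOERROR) {
            // Both values are reported in milliseconds.
            std::printf("Minimum timer period: %u ms\n", tc.wPeriodMin);
            std::printf("Maximum timer period: %u ms\n", tc.wPeriodMax);
        }
        return 0;
    }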
                      

belab (#14)

@JKSH For example, with an interval of 5 ms or 50 ms I get nearly the same deviation, up to about +/- 40 microseconds.
@sirop I assume that ClockRes just reads the system clock and not the high-resolution timer.

                         Copyright (C) 2009 Mark Russinovich
                         SysInternals - www.sysinternals.com
                        
                         Maximum timer interval: 15.600 ms
                         Minimum timer interval: 0.500 ms
                         Current timer interval: 1.000 ms
                        
sirop (#15)

@belab said:

@sirop I assume that ClockRes just reads the system clock and not the high-resolution timer.

Please tell me if you get different results with QElapsedTimer::PerformanceCounter set.

sirop (#16)

@belab
And maybe try this to set your timer resolution to 0.500 ms:
http://webcache.googleusercontent.com/search?q=cache:kq0MMcXg5JMJ:undocumented.ntinternals.net/source/usermode/undocumented%2520functions/time/ntsettimerresolution.html+&cd=1&hl=en&ct=clnk&gl=de&client=opera

                            https://github.com/abort/W32ResTimer/blob/master/main.c
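
Those links use the undocumented NtSetTimerResolution call. The documented alternative, which only goes down to 1 ms rather than 0.5 ms, is timeBeginPeriod/timeEndPeriod from winmm; a sketch:

    #include <windows.h>
    #include <mmsystem.h>   // timeBeginPeriod/timeEndPeriod; link against winmm

    // Raise the system timer resolution to 1 ms for the lifetime of this object.
    // Every timeBeginPeriod() call must be paired with a matching timeEndPeriod().
    struct ScopedTimerResolution {
        ScopedTimerResolution()  { timeBeginPeriod(1); }
        ~ScopedTimerResolution() { timeEndPeriod(1); }
    };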

belab (#17)

                              @sirop It's set, I've checked it like this:

    void MainWindow::timerUpdate() {
        static int start = 0;
        static const size_t loops = 10000;
        static qint64 nsecs[loops];
        static size_t index = 0;
        static QElapsedTimer* elapsed = new QElapsedTimer();
        // Skip the first few timeouts so the measurement starts in a steady state.
        if (start++ < 10) {
            elapsed->restart();
            qDebug() << "clockType: " << elapsed->clockType();
            return;
        }
        // Record the time since the previous timeout until all samples are collected.
        if (index < loops) {
            nsecs[index++] = elapsed->nsecsElapsed();
            elapsed->restart();
            return;
        }
        timer->stop();
        // Absolute deviation from the nominal interval (5,000,000 ns = 5 ms).
        qint64 maxDev = 0;
        qint64 avgDev = 0;
        for (qint64 elapsedTime : nsecs) {
            //qDebug() << "Time since last update" << elapsedTime << "nanoseconds";
            qint64 dev = 5000000 - elapsedTime;
            if (dev < 0) {
                dev = dev * -1;
            }
            avgDev += dev;
            if (dev > maxDev)
                maxDev = dev;
        }
        qDebug() << "Max deviation: " << maxDev;
        qDebug() << "Avg deviation: " << avgDev / loops;
    }
                              

With the following output:

                              clockType:  4
                              Max deviation:  4615989
                              Avg deviation:  4327
                              
