
Application crashing while being run on Debug, works on Release

Solved · General and Desktop
6 Posts 3 Posters 3.5k Views
    darkp03
    wrote on last edited by
    #1

    Hi everyone

I have an application that runs with no problems in Release mode through Qt Creator.
If I run the executable directly (from the release folder), it also runs with no problems.

If I try to debug my application, it crashes after a couple of seconds with the following error:

ASSERT: "uint(i) < uint(size())" in file ..\..\..\..\Qt\Qt5.3.2\5.3\mingw482_32\include/QtCore/qbytearray.h, line 432
    Invalid parameter passed to C runtime function.
    Invalid parameter passed to C runtime function.

When I click the error message, I'm not taken to the place in my code where the app crashed; instead I'm taken to the qbytearray.h file, to the following line:

    inline char QByteArray::at(int i) const
    {Q_ASSERT(uint(i) < uint(size())); return d->data()[i];}
    

I did my best to locate the exact point where my program crashes, but since the application has four threads running at the same time, I wasn't able to pinpoint it: the crash happens very fast, and I haven't managed to catch it with breakpoints.

What does this error mean?
If it worked in Debug but not in Release, that would at least make more sense: Debug builds are supposed to catch some mistakes and protect your code.
But here I have the opposite scenario: in Release it works perfectly, but I'm not able to debug it.

    The structure of the program is:

Main thread: Receives data from a QUdpSocket and passes it to the processing threads via a QQueue and a QMutex. It also handles a couple of SQL connections. Runs at QThread::TimeCriticalPriority. Only one instance.

Processing thread: Works on the packets from the queue. All the processing is done by plain C functions (not using any unusual class). Sends the result to the sender thread via a QQueue and a QMutex. Runs at QThread::HighestPriority. Many instances, depending on how many services are online.

Sender thread: Receives packets from a QQueue and sends them via a QUdpSocket to different addresses and ports. Runs at QThread::HighPriority. Only one instance.

Logger thread: Receives QStrings from any of the other threads and writes them to a .txt file for logging purposes. Runs at QThread::IdlePriority. Only one instance.

Any ideas or suggestions?
Thanks in advance.

      matthew.kuiash
      wrote on last edited by matthew.kuiash
      #2

@darkp03 My rough guess is that you've walked off the start or end of a byte array.

Debug builds detect this and fail. Release builds just trample over memory until you get a seg fault or similar (that is, a read/write of invalid memory). Because of the granularity of MMU memory protection, release builds will not always fall over: protection is usually on 4K boundaries, so if you walk off the end of a 3000-byte array (even a page-aligned one), your app will probably carry on regardless even though it has done something bad.

Walking the stack back from the crash point should give you some idea of what caused the problem. Another option is to use Valgrind or similar.

      The legendary cellist Pablo Casals was asked why he continued to practice at age 90. "Because I think I'm making progress," he replied.

        darkp03
        wrote on last edited by
        #3

        Wow,

I thought that Windows' protection was actually better than Qt's own. Apparently it isn't. Thanks for the tip, I will re-check my code.

          matthew.kuiash
          wrote on last edited by
          #4

@darkp03 It isn't OS specific. It's the limit of what can be detected by the CPU's paging and MMU systems.

Memory will be served in multiples of pages, typically 4K. I'm not aware of any finer-grained protection than that.

          Even then the application is free to read/write throughout any application allocated memory. There's no way the CPU can differentiate between object accesses.

          macOS, Windows, Linux on x86, ARM & MIPS all behave broadly the same way.

            kshegunov
            Moderators
            wrote on last edited by
            #5

            @matthew.kuiash said in Application crashing while being run on Debug, works on Release:

            Memory will be served in multiples of pages, typically 4K. I'm not aware of any more accurate protection than that.

This is only part of the story. Paging may or may not be employed. In the olden days they used segmentation (before paging became popular) to keep the addressing word size from growing. Actually, the memory corruption errors reflect this history: on Windows you get an "Access violation" for going past the current page, while on Linux it's a "segmentation fault" for overwriting the next segment. However, in modern computers it may be that neither one is employed. I, for one, run my box without a swap file (Linux), and hence I have only direct addressing, i.e. virtual memory addresses are one and the same as physical memory addresses.

Another layer of security is what you mentioned: the execute-disable bit protection some CPUs employ. This can be disabled from the BIOS though, so it's not a silver bullet either.

            @darkp03 said in Application crashing while being run on Debug, works on Release:

If you're leaning on delegating responsibility for your application's memory integrity to the OS, you're on a slippery slope, my friend. Some compilers (notably g++) will insert stack-smashing guards along buffer boundaries, but this is a last-resort measure that can't substitute for proper debugging. So don't ever ignore debug assertions (which is what you probably triggered); instead, fix the offending memory access so it doesn't happen anymore.

            Kind regards.

            Read and abide by the Qt Code of Conduct

              matthew.kuiash
              wrote on last edited by
              #6

@kshegunov Ah yes... history, "the old days". I still have address aliasing; near, far and huge pointers; RAM bank switching; EMM, XMS and HIMEM burnt into my mind, for all the wrong reasons...
