QHttpServerResponder and large files

Solved · General and Desktop
2 Posts 1 Posters 130 Views
JoseJX
wrote on last edited by
#1

    I'm having trouble coming up with a way to send large files through QHttpServerResponder and I was hoping someone here could help! What I'm seeing is that all of the suggested approaches appear to put the file into RAM. I'm trying to send a file that is larger than my available RAM so that's not a viable approach. My initial implementation was using something like:

    responder.writeBeginChunked(headers);
    while (!f.atEnd() && !f.error()) {
        QByteArray buf = f.read(65536);
        if (f.atEnd())
            responder.writeEndChunked(buf);
        else
            responder.writeChunk(buf);
    }
    

    This works, but it will buffer all of the chunks before sending anything, so I need to have enough RAM available for the whole file.

QTBUG-135937 appears to be the same issue I'm running into; however, it was closed as invalid. This forum post also describes the same issue, but the suggestion to use QtWebApp (a non-Qt library) isn't really a solution.

In the bug report, it is suggested to use the QIODevice overload of the QHttpServerResponder write method, but with a non-sequential QFile it appears to read the whole file into RAM anyway. A subclassed QIODevice that wraps the file but overrides the isSequential method to report that it is sequential appears to never send any data to the socket. I'm not sure why yet, but I'll update this if I can get this approach to work.

Does anyone have a suggestion for how else I could send large files with the QHttpServerResponder API that doesn't require the file to be read into RAM first?

    Thanks!

JoseJX
wrote on last edited by JoseJX
#2

Well, I figured it out shortly after posting this. Creating a QIODevice that pretends to be sequential is the right answer. I overrode the following methods: open, close, isSequential, bytesAvailable, atEnd (although it doesn't appear that this one is actually used), and readData.

When the device is opened, I open the file and set up the available byte count. I added another method that lets me set a partial range, which adjusts the available bytes. Once I am ready to read data from the file, I emit the readyRead signal. The readData function returns bytes as needed until the end of the file; when the file is read and there are no bytes remaining (EOF), I emit the readChannelFinished signal. The isSequential method just returns true so that we don't end up buffering the whole file.

I also had to ensure that the lifetime of this object was long enough for the QHttpServerResponder to complete the read, so make sure it's not allocated on the stack.
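For reference, a minimal sketch of such a device, reconstructed from the description above. The class name, the setRange() and start() helpers, and the exact EOF handling are my own assumptions, not the poster's actual code; check the QHttpServerResponder::write(QIODevice *, ...) overload and its ownership semantics against your Qt version before relying on this.

```cpp
#include <QFile>
#include <QIODevice>

// A read-only device that reports itself as sequential so that
// QHttpServerResponder streams it in chunks instead of buffering it whole.
class SequentialFileDevice : public QIODevice
{
    Q_OBJECT  // requires moc; declare in a header
public:
    explicit SequentialFileDevice(const QString &path, QObject *parent = nullptr)
        : QIODevice(parent), m_file(path) {}

    bool open(OpenMode mode) override
    {
        if (!m_file.open(QIODevice::ReadOnly))
            return false;
        m_remaining = m_file.size();  // set up the available byte count
        return QIODevice::open(mode);
    }

    void close() override
    {
        m_file.close();
        QIODevice::close();
    }

    // Claiming to be sequential is what prevents whole-file buffering.
    bool isSequential() const override { return true; }

    qint64 bytesAvailable() const override
    {
        return m_remaining + QIODevice::bytesAvailable();
    }

    bool atEnd() const override { return m_remaining == 0; }

    // Hypothetical helper: serve only a sub-range of the file
    // (useful for HTTP Range requests).
    void setRange(qint64 offset, qint64 length)
    {
        m_file.seek(offset);
        m_remaining = length;
    }

    // Kick off streaming once the responder is connected.
    void start() { emit readyRead(); }

protected:
    qint64 readData(char *data, qint64 maxSize) override
    {
        if (m_remaining <= 0)
            return -1;  // EOF for a sequential device
        const qint64 n = m_file.read(data, qMin(maxSize, m_remaining));
        if (n > 0) {
            m_remaining -= n;
            if (m_remaining == 0)
                emit readChannelFinished();  // no bytes left: signal completion
        }
        return n;
    }

    qint64 writeData(const char *, qint64) override { return -1; }  // read-only

private:
    QFile m_file;
    qint64 m_remaining = 0;
};
```

Per the lifetime note above, the device must outlive the route handler, so allocate it on the heap rather than the stack, e.g. `auto *dev = new SequentialFileDevice(path);` followed by `dev->open(QIODevice::ReadOnly);` and `responder.write(dev, "application/octet-stream");` (whether the responder takes ownership of the device is an assumption to verify in your Qt version's documentation).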

      Hope that helps anyone else with the same problem!

JoseJX has marked this topic as solved
