[Solved] Design Advice : Download in parts and merge.

6 Posts 2 Posters 1.4k Views
    ashishbansal
    wrote on last edited by
    #1

    Hey, I want to download one file in parts and then merge it. I can download the file in parts by sending a Range header in each request and then writing the parts sequentially to get the full file.
    I am unable to decide on a good design technique, and I think what I have in mind is a very bad design.
    Here's what I have thought:
    Suppose I have a file of size 1200 bytes.
    I will make 6 QNetworkRequests with Range headers of 0-199, 200-399, and so on. Then I will send GET requests using one QNetworkAccessManager and connect the finished signal to a downloadFinished slot.
    There will be 6 QNetworkReplys corresponding to the different requests.
    I have a counter variable which I increment in the downloadFinished function.
    If the value of that counter is less than 6, I return.
    Otherwise I write the data from the 6 QNetworkReplys into the file in the specific sequence.

    Something like this:
    @
    // connect first, so no finished signal can be missed
    connect(qnam, &QNetworkAccessManager::finished, this, &Download::downloadFinished);
    for (int i = 0; i < 6; i++) {
        replyRequest[i] = qnam->get(*req[i]);
    }

    void Download::downloadFinished(QNetworkReply *reply)
    {
        counter++;
        if (counter < 6) {
            return;
        }
        // all 6 parts have arrived: write them out in order
        for (int i = 0; i < 6; i++) {
            tempFile.write(replyRequest[i]->readAll());
            replyRequest[i]->deleteLater(); // cleanup
        }
    }
    @
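For what it's worth, the range arithmetic itself can be kept separate from the networking code. Here is a minimal plain-C++ sketch (the function name is mine) that splits a file of a given size into N contiguous, inclusive byte ranges formatted as Range header values; each string could then be passed to QNetworkRequest::setRawHeader("Range", ...):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Split [0, fileSize) into `parts` contiguous, inclusive byte ranges,
// formatted as HTTP Range header values ("bytes=start-end").
// The last range absorbs the remainder when fileSize is not evenly divisible.
std::vector<std::string> makeRangeHeaders(std::int64_t fileSize, int parts)
{
    std::vector<std::string> headers;
    const std::int64_t chunk = fileSize / parts;
    for (int i = 0; i < parts; ++i) {
        const std::int64_t start = static_cast<std::int64_t>(i) * chunk;
        const std::int64_t end =
            (i == parts - 1) ? fileSize - 1 : start + chunk - 1;
        headers.push_back("bytes=" + std::to_string(start) + "-" +
                          std::to_string(end));
    }
    return headers;
}
```

For the 1200-byte example above this yields "bytes=0-199" through "bytes=1000-1199".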

    Any better solution??

      p3c0
      Moderators
      wrote on last edited by
      #2

      Hi,

      Is there a specific reason you are using get() in a for loop? Remember QNetworkAccessManager has an asynchronous API and hence it won't wait for the current request to finish.

        ashishbansal
        wrote on last edited by
        #3

        [quote author="p3c0" date="1419941355"]Hi,

        Any specific reason you are using get in for loop ? Remember QNetworkAccessManager has an asynchronous API and hence it wont wait for current request to finish. [/quote]

        Hey, yeah. The API documentation describes that QNetworkAccessManager queues all the requests it gets and can process at most 6 requests in parallel.
        So, what I am trying to do is get the file in parts, so that I can fetch the file from the server at the maximum possible speed. That's why I am sending 6 GET requests in the loop.

          p3c0
          Moderators
          wrote on last edited by
          #4

          But the bandwidth will get divided between all the requests, I guess. Also, what if one of the requests fails? Personally, I would send each request sequentially, and only if the previous one succeeds. But these are just my thoughts, as I have never tried the approach you have followed.

            ashishbansal
            wrote on last edited by
            #5

            [quote author="p3c0" date="1420001874"]But the bandwidth will get divided between all the requests I guess.[/quote]
            Many servers are configured to limit the speed of a single connection. So, for example, if my ISP provides 1 MB/s and the server limits each connection to 200 KB/s, then by making 5 connections I would be able to get that file at maximum bandwidth.
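To make that arithmetic explicit (a trivial sketch; the function name is mine): the effective rate is the smaller of the client's line speed and the per-connection cap times the number of connections.

```cpp
#include <algorithm>
#include <cstdint>

// Effective download rate with `connections` parallel requests, when the
// server caps each connection at `perConnCap` and the client's line tops
// out at `clientCap` (both in KB/s here).
std::int64_t effectiveRate(std::int64_t clientCap, std::int64_t perConnCap,
                           int connections)
{
    return std::min(clientCap, perConnCap * connections);
}
```

With clientCap = 1000 KB/s and perConnCap = 200 KB/s, one connection gives 200 KB/s while five connections saturate the full 1000 KB/s line.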
            [quote]Also what if one of the request fails ?[/quote]
            Well, I hadn't thought of this, but I think I can handle that problem. I would keep writing data continuously into the QTemporaryFile, in a slot connected to QNetworkReply::downloadProgress.
            So, if any request fails, I would send that request again but with a changed HTTP Range header, i.e. the end stays the same but the start becomes the number of bytes already written for that part. This way I think I would be able to resume that request from where it failed.[quote]
            Personally, I would send each requests sequentially and that too if previous one succeeds. But these are my thoughts as I have never tried the approach that you have followed.[/quote]
            Yeah, but to get the maximum speed I have to do this. I guess that's the same thing other good download managers do.
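One detail worth double-checking in that resume scheme: HTTP ranges are inclusive and zero-based, so if n bytes of a part are already on disk, the next byte needed has index partStart + n. A small plain-C++ sketch (the function name is mine) that builds the retry header:

```cpp
#include <cstdint>
#include <string>

// Build the Range header value for retrying a failed part. HTTP ranges
// are inclusive and zero-based: if `bytesWritten` bytes of the part
// starting at `partStart` are already saved, the next byte needed has
// index partStart + bytesWritten.
std::string resumeRangeHeader(std::int64_t partStart, std::int64_t partEnd,
                              std::int64_t bytesWritten)
{
    return "bytes=" + std::to_string(partStart + bytesWritten) + "-" +
           std::to_string(partEnd);
}
```

E.g. a part covering 200-399 that failed after 50 bytes would be retried with "bytes=250-399".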

              p3c0
              Moderators
              wrote on last edited by
              #6

              Ok. Good to go then. Maybe someone else has other opinions.
