[Solved] Design Advice: Download in parts and merge.
-
Hey, I want to download one file in parts and then merge them. I can download the file in parts by sending a Range header in each request and then writing the parts sequentially to get the full file.
I am unable to decide on a good design, and I think what I have in mind is pretty bad.
Here's what I have thought of:
Suppose I have a file of size 1200 bytes.
I will make 6 QNetworkRequests with Range headers of 0-199, 200-399 and so on. Then I will send the get requests using one QNetworkAccessManager and connect its finished signal to a downloadFinished slot.
There will be 6 QNetworkReplys corresponding to the different requests.
I keep a counter variable which I increment in downloadFinished.
If the value of that counter is less than 6, then return.
Else I will write the data from the 6 QNetworkReplys into the file in the correct sequence. Something like this:
@
// connect once; finished fires asynchronously for each reply
connect(qnam, &QNetworkAccessManager::finished, this, &Download::downloadFinished);

for (int i = 0; i < 6; i++) {
    replyRequest[i] = qnam->get(*req[i]);
}

void Download::downloadFinished(QNetworkReply *reply)
{
    Q_UNUSED(reply)
    counter++;
    if (counter < 6) {
        return;
    }
    // all 6 parts have arrived: write them out in order
    for (int i = 0; i < 6; i++) {
        tempFile.write(replyRequest[i]->readAll());
    }
    //cleanup
}
@
Any better solution?
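For completeness, here is roughly how the 6 Range requests for the 1200-byte example could be built (a sketch; the URL is a placeholder):
@
// Sketch: one Range request per 200-byte chunk of a 1200-byte file.
// The URL is a placeholder, not a real server.
const QUrl url("http://example.com/file.bin");
for (int i = 0; i < 6; i++) {
    req[i] = new QNetworkRequest(url);
    QByteArray range = "bytes=" + QByteArray::number(i * 200)
                     + "-" + QByteArray::number(i * 200 + 199);
    req[i]->setRawHeader("Range", range);
}
@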
-
Hi,
Any specific reason you are using get() in a for loop? Remember QNetworkAccessManager has an asynchronous API, hence it won't wait for the current request to finish.
-
[quote author="p3c0" date="1419941355"]Hi,
Any specific reason you are using get in for loop ? Remember QNetworkAccessManager has an asynchronous API and hence it wont wait for current request to finish. [/quote]
Hey, yeah. The API documentation describes that QNetworkAccessManager queues all the requests it gets and processes at most 6 requests in parallel.
So what I am trying to do is get the file in parts, so that I can fetch the file from the server at the maximum possible speed. That's why I am sending the 6 get requests in the loop.
-
But the bandwidth will get divided between all the requests, I guess. Also, what if one of the requests fails? Personally, I would send each request sequentially, and only if the previous one succeeds. But these are just my thoughts, as I have never tried the approach you have followed. A sequential version could look roughly like the sketch below.
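A minimal sketch, reusing the qnam, req, counter and tempFile members from your snippet (untested, error handling only indicated):
@
// Sequential alternative: part i+1 is requested only after part i succeeded.
// Start by issuing qnam->get(*req[0]) with counter = 0.
void Download::downloadFinished(QNetworkReply *reply)
{
    if (reply->error() != QNetworkReply::NoError) {
        // a part failed: retry or abort here instead of continuing
        reply->deleteLater();
        return;
    }
    tempFile.write(reply->readAll()); // parts arrive in order, so just append
    reply->deleteLater();
    counter++;
    if (counter < 6) {
        qnam->get(*req[counter]); // kick off the next range
    }
}
@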
-
[quote author="p3c0" date="1420001874"]But the bandwidth will get divided between all the requests I guess.[/quote]
Many servers are configured to limit the speed per connection. So, for example, if my ISP provides 1 MBps and the server limits each connection to 200 KBps, then with 5 connections (5 × 200 KBps = 1 MBps) I can get the file at my full bandwidth.
[quote]Also, what if one of the requests fails?[/quote]
Well, I hadn't thought of that. But I think I can handle the problem: I would keep writing data continuously into a QTemporaryFile (one per part, otherwise the parallel parts would interleave) whenever QNetworkReply::downloadProgress is emitted.
So, if any request fails, I would send that request again with a changed HTTP Range header, i.e. the end would be the same but the start would be the size of that part's QTemporaryFile (ranges are 0-indexed, so the next byte needed is at the offset equal to the number of bytes already written). This way I think I would be able to resume the request from where it failed, roughly as in the sketch below.
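A minimal sketch of the resume idea, untested. It uses readyRead to grab the data as it arrives (downloadProgress only reports byte counts); partFile, partEnd, url and qnam are placeholder members of mine:
@
// Write each chunk to disk as it arrives, so nothing is lost on failure.
void Download::onReadyRead()
{
    QNetworkReply *reply = qobject_cast<QNetworkReply *>(sender());
    partFile.write(reply->readAll());
}

// On failure, re-request only the missing tail of this part.
void Download::onError(QNetworkReply::NetworkError)
{
    QNetworkRequest retry(url);
    // next byte needed == bytes already written (ranges are 0-indexed)
    QByteArray range = "bytes=" + QByteArray::number(partFile.size())
                     + "-" + QByteArray::number(partEnd);
    retry.setRawHeader("Range", range);
    qnam->get(retry); // reconnect the new reply's readyRead/error to these slots
}
@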
[quote]Personally, I would send each request sequentially, and only if the previous one succeeds. But these are just my thoughts, as I have never tried the approach you have followed.[/quote]
Yeah, but to get the maximum speed I have to do this. I guess that's the same thing other good download managers do.
-
Ok, good to go then. Maybe someone else has other opinions.