Qt Forum

    Upload large files HTTP request, without increasing the application memory (Unsolved)

    General and Desktop
    Tags: qnetworkrequest, upload, desktop, http, c++
    • Tusharh

      My program needs to upload huge files of 1 GB, 2 GB, etc. I have tried a QHttpMultiPart request as well as plain PUT and POST requests on QNetworkAccessManager. Every time the request is sent, the application memory shoots up from 20 MB to 1 GB or 2 GB, depending on the file size, which eventually crashes my application (a Windows Qt app).

      QHttpMultiPart *multiPart = new QHttpMultiPart(QHttpMultiPart::FormDataType);

      // plain text part
      QHttpPart textPart;
      textPart.setHeader(QNetworkRequest::ContentDispositionHeader, QVariant("form-data; name=\"text\""));
      textPart.setBody("my text");

      // file part: the body is taken from a QIODevice instead of a byte array
      QHttpPart imagePart;
      imagePart.setHeader(QNetworkRequest::ContentTypeHeader, QVariant("image/jpeg"));
      imagePart.setHeader(QNetworkRequest::ContentDispositionHeader, QVariant("form-data; name=\"image\""));
      QFile *file = new QFile("LargeFile.zip");
      if (!file->open(QIODevice::ReadOnly))
          return; // handle the error appropriately in real code
      imagePart.setBodyDevice(file);
      file->setParent(multiPart); // the multi-part takes ownership of the file

      multiPart->append(textPart);
      multiPart->append(imagePart);

      QUrl url("http://my.server.tld");
      QNetworkRequest request(url);

      QNetworkAccessManager manager; // in the real application this is a long-lived member
      QNetworkReply *reply = manager.post(request, multiPart);
      multiPart->setParent(reply); // the multi-part is deleted together with the reply
      
      **HUGE increase in application memory, equivalent to the file size, at this point after the POST request is fired, as the whole file is read and added into the request body**
      
      What I found in C# is that it's possible to upload huge files without reading the whole file into memory: the file is read and written to the HTTP request stream in chunks.
      
      With the code below, whatever the file size in GB, the application memory never goes beyond 40 MB.
      
      public void upload(string url, string imagename, string imagepath, string finalType)
      {
      	FileStream fileStream = null;
      	Stream rs = null;
      	WebResponse lobjResponse = null;
      	string lstrResponse = string.Empty;
      	bool lblnIsConnected = true;
      	long mlngBytesUploaded = 0;
      	long llngContentLength = 0;
      	long[] llngarrProgressStatusData = new long[2];
      	string pstrResponse = string.Empty;
      
      	try
      	{
      		long lintContentLenght = 0;
      		string boundary = "---------------------------" + DateTime.Now.Ticks.ToString("x");
      		byte[] boundarybytes = System.Text.Encoding.ASCII.GetBytes("\r\n--" + boundary + "\r\n");
      
      		string formdataTemplate = "Content-Disposition: form-data; name=\"{0}\"\r\n\r\n{1}";
      		string formitem = string.Format(formdataTemplate, "abc", "abc-val");
      		byte[] formitembytes = System.Text.Encoding.UTF8.GetBytes(formitem);
      
      		string headerTemplate = "Content-Disposition: form-data; name=\"{0}\"; filename=\"{1}\"\r\nContent-Type: {2}\r\n\r\n";
      		string header = string.Format(headerTemplate, imagename, imagename, finalType);
      		byte[] headerbytes = System.Text.Encoding.UTF8.GetBytes(header);
      
      		byte[] trailer = System.Text.Encoding.ASCII.GetBytes("\r\n--" + boundary + "--\r\n");
      
      		fileStream = new FileStream(imagepath, FileMode.Open, FileAccess.Read);
      
      		lintContentLenght += boundarybytes.Length;
      		lintContentLenght += formitembytes.Length;
      		lintContentLenght += boundarybytes.Length;
      		lintContentLenght += headerbytes.Length;
      		lintContentLenght += fileStream.Length;
      		lintContentLenght += trailer.Length;
      		
      		HttpWebRequest wr = (HttpWebRequest)WebRequest.Create(url);
      		wr.AllowWriteStreamBuffering = false;
      		wr.ContentType = "multipart/form-data; boundary=" + boundary;
      		wr.ContentLength = lintContentLenght;
      		wr.Method = "POST";
      		wr.KeepAlive = true;
      		
      		  using (rs = wr.GetRequestStream())
      		  {
      			  rs.Write(boundarybytes, 0, boundarybytes.Length);
      			  rs.Write(formitembytes, 0, formitembytes.Length);
      			  rs.Write(boundarybytes, 0, boundarybytes.Length);
      			  rs.Write(headerbytes, 0, headerbytes.Length);
      
      			  int bytesRead = 0;
      			  byte[] bufferSize = new byte[4096];
      
      			  llngContentLength = wr.ContentLength;
      			  llngarrProgressStatusData[0] = llngContentLength;
      
      			  // The file is read in chunks and each chunk is written straight to the
      			  // request stream, so the whole file is never loaded into the request body
      			  // (unlike the QHttpMultiPart request). Application memory stays constant here.
      			  while ((bytesRead = fileStream.Read(bufferSize, 0, bufferSize.Length)) != 0) 
      			  {
      
      				  if (!lblnIsConnected) break;
      					mlngBytesUploaded += bytesRead;
      				  rs.Write(bufferSize, 0, bytesRead);
      				  
      				  if (mlngBytesUploaded > llngContentLength)
      					  mlngBytesUploaded = llngContentLength;
      					  
      				  llngarrProgressStatusData[1] = mlngBytesUploaded;
      				  NotifyHTTPProgressBarUpdate(mlngBytesUploaded, llngContentLength);
      			  }
      
      			  fileStream.Close();
      
      			  if (lblnIsConnected)
      			  {
      				  rs.Write(trailer, 0, trailer.Length);
      				  rs.Close();
      			  }
      		  }
      	}
      	finally
      	{
      		// release the file handle even if the upload fails
      		if (fileStream != null)
      			fileStream.Close();
      	}
      }
      

      Sending the file in separate chunks would be an option, but I currently don't have that support on my server.

      Can you please suggest a way to implement something similar to the above C# code in Qt, i.e. upload a file as a stream over an HTTP PUT/POST request to a given URL?

      Thanks in advance!

      • sierdzio (Moderators)

        QNAM accepts a QIODevice in post() and put() - see the QNetworkAccessManager documentation.

        So when you pass a QFile instance to that method, it works using streaming operators (without loading the whole file into RAM). Have you tried that approach?
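
        For illustration, a rough sketch of that approach (the Content-Type below is an assumption; the URL is the one from your snippet). The file is handed to put() and QNAM reads from the device while sending:

        #include <QCoreApplication>
        #include <QFile>
        #include <QNetworkAccessManager>
        #include <QNetworkReply>
        #include <QNetworkRequest>
        #include <QUrl>

        int main(int argc, char *argv[])
        {
            QCoreApplication app(argc, argv);

            // Open the file but do not read it here; QNAM pulls the data from the device.
            QFile *file = new QFile("LargeFile.zip");
            if (!file->open(QIODevice::ReadOnly))
                return 1;

            QNetworkRequest request(QUrl("http://my.server.tld")); // URL from the original snippet
            request.setHeader(QNetworkRequest::ContentTypeHeader, "application/octet-stream"); // assumption

            QNetworkAccessManager manager; // must outlive the reply
            QNetworkReply *reply = manager.put(request, file);
            file->setParent(reply); // the file object is deleted together with the reply

            QObject::connect(reply, &QNetworkReply::finished, &app, [&app, reply]() {
                reply->deleteLater();
                app.quit();
            });
            return app.exec();
        }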


        • raven-worx (Moderators) @sierdzio

          @sierdzio said in Upload large files HTTP request, without increasing the application memory.:

          it works using streaming operators (without loading the whole file into RAM)

          That's what I personally would also expect using the above code, but the docs are not clear regarding this.


          • sierdzio (Moderators) @raven-worx

            @raven-worx said in Upload large files HTTP request, without increasing the application memory.:

            @sierdzio said in Upload large files HTTP request, without increasing the application memory.:

            it works using streaming operators (without loading the whole file into RAM)

            That's what I personally would also expect using the above code, but the docs are not clear regarding this.

            I think this line is quite clear:

            because the content is not copied when using this method, but read directly from the device

            Anyway, I did not link to QHttpPart, I linked to QNAM directly. Both approaches should be good.


            • Tusharh

              @sierdzio I tried passing a QFile instance to the post() and put() methods of QNetworkAccessManager. No success; the whole file is loaded into memory.

              • sierdzio (Moderators) @Tusharh

                @Tusharh said in Upload large files HTTP request, without increasing the application memory.:

                @sierdzio I tried passing a QFile instance to the post() and put() methods of QNetworkAccessManager. No success; the whole file is loaded into memory.

                OK, I believe it is a bug, then, please report it.


                • Tusharh @sierdzio

                  @sierdzio I'm using Qt 5.4.1, and my application has a dependency on 5.4.1. How can I know whether this bug is resolved in 5.7 (besides installing Qt 5.7 on my machine)?

                  • raven-worx (Moderators) @Tusharh

                    @Tusharh
                    Is this the whole code you are using? Because if even one part of the multi-part object is backed by a sequential device, the whole multi-part becomes sequential and is therefore loaded into memory.
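
                    One way to verify that is to check isSequential() on the device before appending it as a part; a QFile on a regular file should report false. A rough sketch (the helper function is made up for illustration):

                    #include <QDebug>
                    #include <QFile>
                    #include <QHttpMultiPart>
                    #include <QHttpPart>
                    #include <QNetworkRequest>
                    #include <QVariant>

                    // Append a file-backed part only when the device is random-access;
                    // a single sequential part makes the whole multi-part sequential.
                    bool appendFilePart(QHttpMultiPart *multiPart, const QString &path, const QByteArray &name)
                    {
                        QFile *file = new QFile(path, multiPart); // owned by the multi-part
                        if (!file->open(QIODevice::ReadOnly) || file->isSequential()) {
                            qWarning() << "cannot stream" << path;
                            delete file;
                            return false;
                        }

                        QHttpPart part;
                        part.setHeader(QNetworkRequest::ContentDispositionHeader,
                                       QVariant("form-data; name=\"" + name + "\""));
                        part.setBodyDevice(file); // content is read from the device, not copied
                        multiPart->append(part);
                        return true;
                    }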


                    • Tusharh @raven-worx

                      @raven-worx Yes, this is the code. I also separately tried QNAM()->put(request, QFile*) and QNAM()->post(request, QFile*) on my Qt 5.4.1 setup. Every time, after executing the request, the application memory keeps increasing until it is roughly equal to the given file size (here I used a 1 GB file and memory increased by 1 GB).
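
                      For reference, the transfer can be watched while checking memory in Task Manager by connecting QNetworkReply::uploadProgress on the reply returned by put()/post(), roughly the Qt counterpart of the NotifyHTTPProgressBarUpdate() call in the C# code above. A small sketch (the helper name is made up):

                      #include <QDebug>
                      #include <QNetworkReply>
                      #include <QObject>

                      // Hypothetical helper: log how many bytes QNAM has handed to the socket so far,
                      // so upload progress can be correlated with the process memory usage.
                      void watchUploadProgress(QNetworkReply *reply)
                      {
                          QObject::connect(reply, &QNetworkReply::uploadProgress,
                                           reply, [](qint64 sent, qint64 total) {
                                               qDebug() << "uploaded" << sent << "of" << total << "bytes";
                                           });
                      }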
