Unsolved Upload file via HTTP - callback/signal for feeding data from buffer (not via QFile)
I'm porting a Windows GUI application to Linux and am using Qt, mainly for its GUI framework, but also for everything else Qt has to offer.
The original code uses libcurl for HTTP communication to upload files. The curl_multi_* asynchronous API is used for non-blocking behavior.
The uploaded files are created dynamically: a read callback (CURLOPT_READFUNCTION) is set on cURL, and it is called whenever cURL needs the user to 'feed' it more data to upload.
From what I've seen in Qt, a QFile is passed and everything is done 'behind the scenes': progress and finished signals are emitted, as opposed to a GET request, where the readyRead() signal is emitted.
I thought of creating a temporary file just before using the standard Qt method and passing that file to the QNAM, but I'm hoping for a more elegant solution.
Is there a way to use Qt the way cURL works?
Hi and welcome to devnet forum
Why do you intend to create a temporary file before uploading with Qt?
Hi and welcome to devnet,
IIRC, you can use either a QIODevice, a QByteArray or QHttpMultiPart. This should allow you to build your buffer and send the data the way you want.
Hope it helps
@koahnig The current code is built around the cURL API, which sends (or buffers internally) the data in chunks, calling the read callback whenever it needs the user to supply the next chunk.
I need to change the behavior somewhere, and I'd rather not touch the 'heart' of the code, only its edges. So I thought of reusing the same read callback to write all the data into a file, and then passing that file to the QNetworkAccessManager.