QFtp and empty files



  • More than a year has passed since my first try with QFtp, and this problem is still there.
    Here is the code:
    @QFtp* ftp = new QFtp( this );
    ftp->connectToHost( host );
    ftp->login( login, pass );
    ftp->cd( "/folder_with_thousands_of_files" );
    foreach( QString file, files )
    {
        QFile* f = new QFile( "/output_dir/" + file, this );
        f->open( QIODevice::WriteOnly );
        ftp->get( file, f );
    }@

    I left out some insignificant code, but it should still be clear what's going on.
    With this code I'm trying to download about 1k files (but no more than the maximum number of open file handles).
    After QFtp emits done(false) (all previous commandFinished signals were emitted without error, and every QFile opened successfully), /output_dir contains the same files as /input_dir, except that most of them are empty and some are incomplete.
    But when I try to download a small number of files (I didn't test many values, but 10 works), everything is fine.

    Why does this happen, and why is no error reported when QFtp writes fewer bytes to the QIODevice than the requested file contains?



  • I've figured out that it depends, but I don't understand on what exactly.
    Some runs download only some of the files, others download the full list. In neither case are any errors reported.



  • Two remarks:

    • Your code leaks memory, massively. Each iteration of your loop creates a new QFile object on the heap which is never deleted (at least not in the code you provided).
    • QFile and QIODevice do some internal buffering. Since you neither close nor delete your QFile objects, these buffers probably never get flushed, which means that parts of the downloaded files stay in memory and never get written to disk. This could explain the behaviour you've described.


  • Sorry, I didn't post that part of the code. Every QFile is closed and deleted when the commandFinished signal with the respective command id is emitted.
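    For readers hitting the same issue, the cleanup described above might look roughly like this. This is only a sketch: the Downloader class, the m_ftp member, and the m_files map from command id to QFile* are assumptions, not code from the original post.

    ```cpp
    // Hypothetical slot connected to QFtp::commandFinished(int, bool) (Qt 4 API).
    // m_files maps the id returned by QFtp::get() to the QFile it writes into;
    // both m_files and m_ftp are assumed members, not from the original post.
    void Downloader::onCommandFinished(int id, bool error)
    {
        QMap<int, QFile*>::iterator it = m_files.find(id);
        if (it == m_files.end())
            return;                       // not a get() command we started
        QFile* f = it.value();
        f->close();                       // flushes buffered data to disk
        if (error)
            qDebug() << "get failed for" << f->fileName() << ":" << m_ftp->errorString();
        f->deleteLater();                 // safe deletion from within a slot
        m_files.erase(it);
    }
    ```

    Closing the file here is what actually flushes the buffers the previous reply mentions; deleting alone without close() would also flush, but deleteLater() makes the timing explicit and safe inside a slot.
    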



  • It is also very likely that the FTP server does not support downloading 1000 files simultaneously. I'm pretty sure you are getting a whole bunch of error messages/signals in the meantime that you do not handle.

    Why don't you just download the files sequentially? The bandwidth is limited anyway, and you usually gain no significant speedup by downloading more than one file at once.



  • I did not get any messages, even though every error signal was connected to its own slot with debug output.

    Yes, I wrote a kind of wrapper around QFtp that has its own internal queue and invokes the next QFtp::get() only when there are no pending operations.

    Hmm. I wonder where my eyes were or what I did wrong, but after a few more tests it seems that everything is fine. Every queued file was downloaded completely.

    The only question I have now is here: http://developer.qt.nokia.com/forums/viewthread/9474/
    Maybe you could take a look at it?
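    The queueing idea behind such a wrapper can be sketched in plain C++, independent of Qt. This is a minimal sketch under assumptions: SequentialQueue and DownloadFn are invented names, and the completion callback stands in for the commandFinished signal that the real QFtp wrapper would react to.

    ```cpp
    #include <functional>
    #include <queue>
    #include <string>

    // Sketch of a sequential download queue: requests are queued, and the
    // next transfer starts only after the previous one reports completion.
    // DownloadFn stands in for QFtp::get(); it receives the file name and a
    // callback to invoke when the transfer is done (the commandFinished role).
    class SequentialQueue {
    public:
        using DownloadFn =
            std::function<void(const std::string&, std::function<void()>)>;

        explicit SequentialQueue(DownloadFn download)
            : download_(std::move(download)) {}

        void enqueue(const std::string& file) {
            pending_.push(file);
            if (!busy_)          // idle: start the transfer immediately
                startNext();
        }

    private:
        void startNext() {
            if (pending_.empty()) { busy_ = false; return; }
            busy_ = true;
            std::string file = pending_.front();
            pending_.pop();
            // In the Qt version this would be ftp->get(file, ...); here the
            // callback drives the "one pending operation at a time" rule.
            download_(file, [this] { startNext(); });
        }

        DownloadFn download_;
        std::queue<std::string> pending_;
        bool busy_ = false;
    };
    ```

    The design choice is the same one the poster arrived at: never have more than one outstanding transfer, so the server sees a single data connection and each output file can be closed before the next one opens.
    
    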

