
reading from file directly vs through a stream



  • Is it faster to read files through a stream than directly from the file when the file is large?
    Do streams provide an advantage?



  • @user4592357
    Nothing can read from a file faster through a stream than "directly"; after all, the stream still has to do that read itself. But it depends on how and why you do it, and a stream wrapper can provide other advantages. I imagine the stream implementation is pretty efficient for direct sequential reading from a file anyway. The exception would be if you do file seeking while reading, but I don't imagine you are doing that.
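
    For illustration, a minimal sketch of what "directly" vs "through a stream" can look like in Qt (the function names and the missing error handling are just placeholders):

    #include <QByteArray>
    #include <QFile>
    #include <QString>
    #include <QTextStream>

    // "Direct" read: pull the whole file in with one QFile call.
    QByteArray readDirect(const QString &path)
    {
        QFile file(path);
        if (!file.open(QIODevice::ReadOnly))
            return {};
        return file.readAll();           // one big read from the device
    }

    // Read through a stream: the QTextStream wraps the same QFile,
    // so underneath it still has to issue reads on the device.
    QString readViaStream(const QString &path)
    {
        QFile file(path);
        if (!file.open(QIODevice::ReadOnly | QIODevice::Text))
            return {};
        QTextStream stream(&file);
        return stream.readAll();         // convenience layer on top of QFile
    }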


  • Lifetime Qt Champion

    Hi
    Streams often provide more object-oriented features than the "raw" read and write of a plain file,
    like adding support for new known types:
    QDataStream &operator<<(QDataStream &stream, const MyType &type);
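
    For example, a hypothetical MyType could be made streamable like this (a minimal sketch; the struct and its members are made up for illustration):

    #include <QDataStream>
    #include <QString>

    struct MyType
    {
        qint32  id;
        QString name;
    };

    // Write the members in a fixed order...
    QDataStream &operator<<(QDataStream &stream, const MyType &type)
    {
        stream << type.id << type.name;
        return stream;
    }

    // ...and read them back in exactly the same order.
    QDataStream &operator>>(QDataStream &stream, MyType &type)
    {
        stream >> type.id >> type.name;
        return stream;
    }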

    About speed:
    it mostly depends on what is being read and how.

    Reading a file containing structured objects
    will not be (much) faster with a raw file read,

    but say we have a file of 10,000 ints.

    Then using
    qint64 QIODevice::read(char *data, qint64 maxSize)
    to read them all into an internal array in one big read will be faster
    than code that reads them int by int through a stream.
    But a stream could also read one big array, so...
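
    A rough sketch of the two styles (a QFile already opened for reading is assumed, and the file is assumed to hold raw qint32 values; note that QDataStream defaults to big-endian, so a real comparison would also set the byte order):

    #include <QDataStream>
    #include <QFile>
    #include <QVector>

    // One big read straight into a buffer of ints.
    QVector<qint32> readAllAtOnce(QFile &file)
    {
        const int count = int(file.size() / qint64(sizeof(qint32)));
        QVector<qint32> values(count);
        file.read(reinterpret_cast<char *>(values.data()),
                  qint64(count) * qint64(sizeof(qint32)));
        return values;
    }

    // Int by int through QDataStream: many small reads plus per-value overhead.
    QVector<qint32> readIntByInt(QFile &file)
    {
        QDataStream stream(&file);
        QVector<qint32> values;
        qint32 value = 0;
        while (!stream.atEnd()) {
            stream >> value;
            values.append(value);
        }
        return values;
    }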

    In the end, it's the disk speed that matters most for how fast it is.

    So it often depends on what you want to read and write whether a stream or a raw file would be "best".



  • @mrjj said in reading from file directly vs through a stream:

    than code that reads them int by int through a stream.

    • I think you can assume that a stream reader will implement reading an array of ints as a read(sizeof-array-of-ints).

    • Whether it does or does not, I think you can assume that the low-level stream will physically call something like read(large-buffer-size) for its operations, and then dole out data from that buffer as necessary for calls, moving an internal read pointer around and re-filling the next buffer's worth when required.

    Else it's not a very good/efficient stream reader!

    Think of, say, the implementation of the stdio buffered FILE * (or presumably the C++ cin/istream etc.) access layer on top of the low-level read() and fd primitives.
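
    Something along those lines, as a deliberately simplified sketch of that buffering idea on top of the POSIX read()/fd primitives (this is not how any particular stream class is actually written):

    #include <unistd.h>   // ::read()
    #include <algorithm>
    #include <cstddef>
    #include <cstring>

    // Toy buffered reader: one big read() fills the buffer, small requests
    // are served from it by moving an internal pointer, and the buffer is
    // refilled when it runs dry.
    class BufferedReader
    {
    public:
        explicit BufferedReader(int fd) : m_fd(fd) {}

        // Copy 'size' bytes into 'dest'; returns the bytes actually delivered.
        std::size_t readBytes(char *dest, std::size_t size)
        {
            std::size_t delivered = 0;
            while (delivered < size) {
                if (m_pos == m_end) {                       // buffer empty...
                    const ssize_t got = ::read(m_fd, m_buffer, sizeof(m_buffer));
                    if (got <= 0)                           // EOF or error
                        break;
                    m_pos = 0;
                    m_end = static_cast<std::size_t>(got);  // ...refill it
                }
                const std::size_t chunk = std::min(size - delivered, m_end - m_pos);
                std::memcpy(dest + delivered, m_buffer + m_pos, chunk);
                m_pos += chunk;
                delivered += chunk;
            }
            return delivered;
        }

    private:
        int m_fd = -1;
        char m_buffer[64 * 1024];     // one "large-buffer-size" read at a time
        std::size_t m_pos = 0;
        std::size_t m_end = 0;
    };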


  • Moderators

    With I/O for me the rule of thumb is always:

    • if you're on fixed hardware - just measure
    • if not (like on PC) - just measure on as many different devices as possible.

    But yeah, although it varies from implementation to implementation, streams are a convenience-oriented layer above raw reads, so intuition would suggest they are at best the same speed and at worst a lot slower due to the additional overhead: inefficient buffering, flushes, stalls, etc. QDataStream in particular adds type checking and parsing, so it's not gonna be as fast as one big raw binary blob read.
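
    For the "just measure" part, a minimal QElapsedTimer sketch (the path is a placeholder, and on a real machine you would also want to account for OS file caching between runs):

    #include <QDataStream>
    #include <QDebug>
    #include <QElapsedTimer>
    #include <QFile>
    #include <QString>

    void benchmarkReads(const QString &path)
    {
        QElapsedTimer timer;

        // Raw big-blob read.
        QFile rawFile(path);
        if (rawFile.open(QIODevice::ReadOnly)) {
            timer.start();
            const QByteArray blob = rawFile.readAll();
            qDebug() << "raw readAll:" << blob.size() << "bytes in"
                     << timer.elapsed() << "ms";
        }

        // Same file through QDataStream, value by value.
        QFile streamFile(path);
        if (streamFile.open(QIODevice::ReadOnly)) {
            QDataStream stream(&streamFile);
            timer.restart();
            qint64 values = 0;
            qint32 value = 0;
            while (!stream.atEnd()) {
                stream >> value;
                ++values;
            }
            qDebug() << "QDataStream:" << values << "ints in"
                     << timer.elapsed() << "ms";
        }
    }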

