How to write entire QVector to a binary file?
-
@J-Hilk Thanks, I will go through the document and see if it can be helpful to me.
@Christian-Ehrlicher @JonB The only other way is to iterate through a QVector<double>, which is a time-consuming process, especially with millions of data points, and that's why I am interested in doing it directly. -
@CJha said in How to write entire QVector to a binary file?:
and is a time-consuming process, especially with millions of data points, and that's why I am interested in doing it directly.
Did you look at the QDataStream implementation? It does exactly the same... so why should this be faster?
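For reference, the container serialization in QDataStream follows a pattern much like this simplified sketch (not the exact Qt source): it writes the element count, then streams each element individually.

```cpp
#include <QDataStream>
#include <QVector>

// Simplified sketch of the pattern QDataStream uses to serialize a
// QVector<T>: write the size, then stream the elements one at a time.
template <typename T>
QDataStream &writeVector(QDataStream &s, const QVector<T> &v)
{
    s << quint32(v.size());   // container size header
    for (const T &x : v)
        s << x;               // one element at a time
    return s;
}
```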
-
@CJha said in How to write entire QVector to a binary file?:
I have to read it in Matlab.
If I remember correctly Matlab reads csv (aka text), so
QTextStream
is what you want. Something like:

```cpp
QFile file("BinTest.bin");
file.open(QIODevice::WriteOnly | QIODevice::Text);
QTextStream out(&file);
QVector<double> vec;
for (int ii = 0; ii < 10; ++ii) {
    vec << ii * 0.33;
}
for (auto &&val : vec)
    out << val << ',';
file.close();
```
-
@Christian-Ehrlicher I didn't know that before I saw the source code for
QDataStream
. I assumed that since the data points of a QVector are in adjacent memory positions, pushing an entire vector into a binary file must be faster than iterating over it. -
@CJha said in How to write entire QVector to a binary file?:
The only other way is to iterate through a QVector<double>, and is a time-consuming process, especially with millions of data points
Yep. That's what has to be done, and it's what the
<<
serializer does, as @J-Hilk showed you. The only other way would be if you can get the address of contiguous QVector memory and save from there, which I'm guessing can be done. However... if @VRonin's latest post is correct and you're supposed to produce text for the export, then you cannot help but do it one-by-one.
P.S.
BTW, you'd have to test, but my guess is that code to output data points one-by-one instead of in a contiguous clump is not what would be slow over 1,000,000 points; rather, the size of the output written to file will be what is significant. -
You're correct about the adjacent memory, but QVector can also hold objects, so memcpy'ing it out will not work there. You can use memcpy if you want in your case, but QDataStream is generic and has no optimizations for such things.
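If you want the fast path while keeping a generic fallback, a sketch along these lines would work (writeVectorFast is a made-up helper name, not a Qt API):

```cpp
#include <QDataStream>
#include <QFile>
#include <QVector>
#include <type_traits>

// Hypothetical helper: raw-write the vector's contiguous buffer when the
// element type is trivially copyable, otherwise fall back to the generic
// element-by-element QDataStream serialization.
template <typename T>
bool writeVectorFast(QFile &file, const QVector<T> &vec)
{
    if constexpr (std::is_trivially_copyable_v<T>) {
        const qint64 bytes = qint64(sizeof(T)) * vec.size();
        return file.write(reinterpret_cast<const char *>(vec.constData()),
                          bytes) == bytes;
    } else {
        QDataStream out(&file);
        out << vec;   // generic path for object element types
        return out.status() == QDataStream::Ok;
    }
}
```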
-
@VRonin Yes, but writing a .csv file takes much longer than writing a binary file (almost 10 times longer for large data sets). I have tried and tested it. I am gathering data at a much faster rate, up to 1 million doubles per second, and I have to write it to a file continuously for hours; this file will be analysed in Matlab by researchers. If I write 1 million data points to a .csv file it takes around 4 seconds, while doing the same in a binary file takes around 400 milliseconds.
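For anyone who wants to reproduce the comparison, a rough sketch of this kind of timing with QElapsedTimer (file names are placeholders):

```cpp
#include <QDebug>
#include <QElapsedTimer>
#include <QFile>
#include <QTextStream>
#include <QVector>

// Rough timing sketch: compare text (CSV) output against one raw binary
// write for the same data. File names are placeholders.
void compareWrites(const QVector<double> &vec)
{
    QElapsedTimer timer;

    QFile csv("test.csv");
    csv.open(QIODevice::WriteOnly | QIODevice::Text);
    QTextStream out(&csv);
    timer.start();
    for (double v : vec)
        out << v << ',';
    out.flush();
    qDebug() << "csv:" << timer.elapsed() << "ms";
    csv.close();

    QFile bin("test.bin");
    bin.open(QIODevice::WriteOnly);
    timer.restart();
    bin.write(reinterpret_cast<const char *>(vec.constData()),
              qint64(sizeof(double)) * vec.size());
    qDebug() << "bin:" << timer.elapsed() << "ms";
    bin.close();
}
```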
-
@CJha said in How to write entire QVector to a binary file?:
I am gathering data at a much faster rate, up to 1 million doubles per second and I have to write it to a file continuously for hours
Actually, stop right here!
If this program is used more than once, you're going to destroy your HD/SSD very quickly!
I'm sure there's another - in-memory - way to hand over those data points.
-
@CJha said in How to write entire QVector to a binary file?:
Matlab supports the binary format
Do you have its specification?
-
@J-Hilk I am not sure what you mean by
if this program is used more than once, you're going to destroy your HD/SSD very quickly!
Given that 1 million doubles are 8 million bytes, I think modern processors and disk drives can handle such speed easily.
-
@CJha
You may (well) know more than I, but can Matlab read and process 8MB of new data per second at the same time as something else is producing it? And, separately, do you really generate 1 million new data points per second?
Also, as @J-Hilk said, wouldn't sending a pipe stream (e.g. a socket?) be better than writing to a file and reading back? Does Matlab accept incoming data from anywhere other than a file?
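For illustration, the producer side of such a hand-over might look like this sketch (QLocalSocket is a named pipe on Windows; "daq-stream" is a made-up server name, and whether Matlab can consume it is exactly the open question):

```cpp
#include <QLocalSocket>
#include <QVector>

// Sketch of the pipe/socket idea: hand sample chunks to a consumer over a
// local socket instead of going through the filesystem.
void sendChunk(QLocalSocket &socket, const QVector<double> &chunk)
{
    socket.write(reinterpret_cast<const char *>(chunk.constData()),
                 qint64(sizeof(double)) * chunk.size());
    socket.flush();
}

// Usage: QLocalSocket sock; sock.connectToServer("daq-stream");
//        if (sock.waitForConnected()) sendChunk(sock, vec);
```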
-
@jsulm It is highly compatible. Here are the links for fopen, fread, fseek. In all of these I can specify the format, ByteOrder, size of data (such as int, double, etc.), and quite a few other things.
I don't think Matlab is the restrictive thing here; I can read any type of binary file in Matlab as long as I know how it is written. -
@CJha
The code to write a QVector to file in the way you want, as fast as possible in one blob rather than one-by-one, is given in e.g. https://www.qtcentre.org/threads/65713-Output-a-QVector-of-doubles-to-a-binary-file-(without-using-QDatastream)?p=289540#post289540 :

```cpp
qint64 bytesWritten = file.write(reinterpret_cast<const char*>(vec.constData()), sizeof(double) * vec.size());
```

EDIT I think you will want
reinterpret_cast<>
rather than static_cast<> here, as shown in that post, so I have altered the code line to use that. -
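Put together with the surrounding file handling, the whole write might look like this sketch (reusing the BinTest.bin file name from earlier in the thread):

```cpp
#include <QDebug>
#include <QFile>
#include <QVector>

// Full write path for the one-liner above: open the file, dump the vector's
// contiguous buffer in one call, and verify the byte count. The doubles land
// in native byte order (little-endian on x86), which Matlab's fopen/fread
// can be told to expect.
int main()
{
    QVector<double> vec;
    for (int i = 0; i < 1000000; ++i)
        vec << i * 0.33;

    QFile file("BinTest.bin");
    if (!file.open(QIODevice::WriteOnly))
        return 1;

    const qint64 expected = qint64(sizeof(double)) * vec.size();
    const qint64 bytesWritten =
        file.write(reinterpret_cast<const char *>(vec.constData()), expected);
    file.close();

    if (bytesWritten != expected)
        qDebug() << "short write:" << bytesWritten << "of" << expected;
    return bytesWritten == expected ? 0 : 1;
}
```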
@JonB No, Matlab is going to read it at a later time. When data is being generated it is just stored in a binary file for later use by Matlab. And yes, Matlab is slower, but it doesn't matter if it takes 1 second or 1 day to read the file, as the researchers can just start loading the file at night and come back in the morning to work on it (many researchers wait 24 to 36 hours for files to get processed).
And yes, I am generating data at 1 million doubles per second. I am using National Instruments and Measurement Computing DAQ boards, controlling both through Qt and C++, and these boards are capable of generating 1 million doubles per second.
-
@CJha said in How to write entire QVector to a binary file?:
@J-Hilk I am not sure what you mean by
if this program is used more than once, you're going to destroy your HD/SSD very quickly!
Given that 1 million doubles are 8 million bytes, I think modern processors and disk drives can handle such speed easily.
It's not about the speed, it's about the number of times each cell is written to. Samsung, for example, says their SSDs are "built to handle 150 terabytes written". With, let's say, 1 million doubles (8 bytes each) per second, your SSD would be done for in roughly 200 days instead of the approximated 10 years (the arithmetic is spelled out below).
Also, you have to coordinate read and write access to the file, so that Matlab and your Qt program do not try to access the file at the same time, with potential data loss etc.
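For the record, the arithmetic behind that estimate, taking the quoted 150 TBW endurance figure at face value:

1,000,000 doubles/s × 8 bytes = 8 MB/s
8 MB/s × 86,400 s/day ≈ 0.69 TB/day
150 TB ÷ 0.69 TB/day ≈ 217 days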
-
@J-Hilk That's a good point, but it's not the case for me, as the data write and data read happen at different times. Also, SSD lifetime doesn't matter, as these researchers have lots of funding and an SSD is a cheap item for them. My job is to give them what they ask for, and if they ruin their SSD in 200 days that is up to them (of course I will tell them that it can ruin their SSD fast, but that's all I can do).