What is a good way to store huge data with Qt C++?
-
I have audio data and I am not sure of the best way to store it as a matrix.
I have 4 large files of recordings from acoustic sensors; each file contains 4 channels of interleaved data.
I am using Qt C++ to do some processing of these data. I already tried this approach, using a QVector of QVectors to store the data:

QVector<QVector<int>> buffer(16); // 4 * 4 : numberOfChannels * numberOfFiles
for (int i = 0; i < 4; i++) {
    QFile file(fileList[i]); // fileList is a QList of QStrings containing the 4 file paths
    if (file.open(QIODevice::ReadOnly)) {
        int k = 0;
        while (!file.atEnd()) {
            QByteArray sample = file.read(depth / 8); // depth here is 24
            int integerSample = convertByteArrayToIntFunction(sample);
            buffer[4 * i + (k % 4)].append(integerSample);
            k++;
        }
    }
}

The goal is to end up with this matrix of 16 columns (f: file, c: channel):
f1c0 | f1c1 | f1c2 | f1c3 | f2c0 | f2c1 | ... | f4c2 | f4c3

But this approach takes ages for large files of a few gigabytes. I am wondering if there is a more efficient way to accomplish this task and save a lot of time.
Thanks in advance.
-
@Aa-Aa Well, don't read the whole file at once, and don't read it 3 bytes at a time either. Read it in chunks.
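To illustrate the chunked-read suggestion, here is a minimal sketch in plain C++ so it stands alone; with Qt you would call QFile::read into the same kind of fixed buffer. The ~1 MiB chunk size, the little-endian byte order, and the helper names (`sample24ToInt`, `deinterleaveChunk`, `readFileChunked`) are assumptions for illustration, not part of the original code. The other big win is reserving the per-channel vectors up front so that appending never reallocates.

```cpp
#include <cstdint>
#include <cstddef>
#include <fstream>
#include <vector>

// Convert one little-endian 24-bit sample to a signed 32-bit int.
// (Little-endian sample order is an assumption; adjust for your sensors.)
int32_t sample24ToInt(const unsigned char *p) {
    int32_t v = p[0] | (p[1] << 8) | (p[2] << 16);
    if (v & 0x800000) v -= 0x1000000;  // sign-extend the 24-bit value
    return v;
}

// Split `len` bytes of interleaved raw data into per-channel vectors:
// sample s of the chunk goes to channels[s % channels.size()].
void deinterleaveChunk(const unsigned char *data, std::size_t len,
                       std::vector<std::vector<int32_t>> &channels) {
    const std::size_t bytesPerSample = 3;  // 24-bit depth
    const std::size_t nSamples = len / bytesPerSample;
    for (std::size_t s = 0; s < nSamples; ++s)
        channels[s % channels.size()]
            .push_back(sample24ToInt(data + s * bytesPerSample));
}

// Read one file in large chunks instead of one 3-byte read() per sample.
// `channels` should be sized to the channel count (4 here) before the call.
void readFileChunked(const char *path,
                     std::vector<std::vector<int32_t>> &channels) {
    const std::size_t chunkBytes = 3 * 4 * 87381;  // ~1 MiB, a whole number of 4-channel frames
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    if (!in) return;
    // Reserve up front: total samples per channel = fileSize / (3 bytes * 4 channels).
    const std::size_t fileSize = static_cast<std::size_t>(in.tellg());
    for (auto &ch : channels) ch.reserve(fileSize / (3 * channels.size()));
    in.seekg(0);
    std::vector<unsigned char> chunk(chunkBytes);
    while (in) {
        in.read(reinterpret_cast<char *>(chunk.data()), chunk.size());
        std::streamsize got = in.gcount();
        if (got <= 0) break;
        deinterleaveChunk(chunk.data(), static_cast<std::size_t>(got), channels);
    }
}
```

The same structure maps directly onto Qt types: QFile::read(char *, qint64) for the chunk loop and QVector::reserve for the preallocation. Eliminating millions of tiny reads and repeated vector growth is where the time goes.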