Unsolved QScintilla: reading a file incrementally to avoid a performance issue
-
I am using QScintilla. I am dumping a big file (1500 MB) line by line and the tool hangs:
if (FILE* fp = UFile::open(ofilename.toLatin1(), "r")) {
    QTextStream ts(fp, QIODevice::ReadOnly);
    QString line;
    do {
        line = ts.readLine();
        append(line);
        append("\n");
    } while (!line.isNull());
    setModified(false);
    UFile::close(fp);
    d_filename = ofilename;
}
Is there any way we can avoid the hang? Is there any way we can ask QScintilla to load the file incrementally as the scroll bar moves (the way it works in gvim on Linux, where the colours come in incrementally)?
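One thing that often helps, independent of QScintilla: read the file in large fixed-size chunks and hand each chunk to the widget in one call, rather than appending line by line, since every append can trigger relayout work. A minimal stdlib C++ sketch of the chunked-read idea (the helper name is mine; in the real editor the string append would be one QScintilla append per chunk instead of one per line):

```cpp
#include <fstream>
#include <string>
#include <vector>

// Read a file in large chunks so the number of append calls stays small.
// In the editor, `document.append(...)` would be replaced by one
// widget-side append per chunk (an assumption, not QScintilla API shown here).
std::string readInChunks(const std::string& path, std::size_t chunkSize = 1 << 20) {
    std::ifstream in(path, std::ios::binary);
    std::string document;
    std::vector<char> buf(chunkSize);
    while (in.read(buf.data(), static_cast<std::streamsize>(buf.size())) ||
           in.gcount() > 0) {
        document.append(buf.data(), static_cast<std::size_t>(in.gcount()));
    }
    return document;
}
```

With a 1 MB chunk size, a 1500 MB file means ~1500 append calls instead of millions.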
-
It is still taking time. Can we have a progress bar attached to loading the file?
-
Here is a link to QProgressDialog; in the detailed description there is also an example of its use.
You can also print some sort of increasing counter to qDebug() if you have a console window. This is probably easier for you to start with.
int count = 0;
do {
    line = ts.readLine();
    append(line);
    append("\n");
    qDebug() << ++count;
} while (!ts.atEnd());
Also, I would start with a much smaller file to see whether the main instructions work as expected. When you run your program in the debugger in debug mode, it will take some time. It all depends on your patience.
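The counter idea generalises: if you know the file size up front, you can convert lines read into a percentage and only report when it changes, which is exactly the value you would feed into a QProgressDialog. A stdlib sketch (the function name is mine; the printf stands in for the dialog update):

```cpp
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <string>

// Report progress as a percentage of bytes consumed, printing only when the
// rounded percentage changes, so a huge file yields ~100 reports instead of
// millions. In Qt the printf would become a progress-dialog update.
int loadWithProgress(const std::string& path) {
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    const long long total = in.tellg();           // file size in bytes
    in.seekg(0);
    std::string line;
    long long seen = 0;
    int lastPercent = -1;
    while (std::getline(in, line)) {
        seen += static_cast<long long>(line.size()) + 1;  // +1 for the '\n'
        const int percent = total > 0
            ? static_cast<int>(std::min<long long>(100, seen * 100 / total))
            : 100;
        if (percent != lastPercent) {
            std::printf("loaded %d%%\n", percent);
            lastPercent = percent;
        }
        // ... the editor-side append(line) would go here ...
    }
    return lastPercent;
}
```

Returning the last reported percentage makes the helper easy to sanity-check on a small file before trying the 1100 MB one.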
-
I changed the code to this now:
clock_t begin = clock();
QFile data(ofilename.toLatin1().constData());
QString line;
if (data.open(QFile::ReadOnly)) {
    setText(data.readAll());
    setModified(false);
    data.close();
    d_filename = ofilename;
    emit fileNameChanged(ofilename);
}
clock_t end = clock();
double elapsed_secs = double(end - begin) / CLOCKS_PER_SEC;
printf("The slow operation took %f", elapsed_secs);
It is taking 99 seconds for a 1100 MB file. Can you suggest better code?
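One caveat about the measurement above: clock() reports CPU time, not wall-clock time, so time the process spends blocked on disk I/O may be under- or over-counted depending on the platform. A hedged sketch of wall-clock timing with std::chrono instead (the helper name is mine):

```cpp
#include <chrono>

// Wall-clock timing via steady_clock: unlike clock(), this also counts time
// spent blocked on I/O, which matters when profiling a 1100 MB read.
template <typename Fn>
double timedSeconds(Fn&& fn) {
    const auto begin = std::chrono::steady_clock::now();
    fn();
    const auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(end - begin).count();
}
```

Usage in the snippet above would be something like: double secs = timedSeconds([&]{ setText(data.readAll()); });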
-
Hi,
this is as compact as it gets.
I don't think you can make it run faster, as it must first read all 1100 MB in,
and then setText might also copy.
You may check which part of the code is really taking time by going back to your line-by-line reading version and summing up the required time slices. Possibly setText allows some improvement, but otherwise @mrjj is correct.
Also, you are not stating how you compiled the source. If it is still debug mode, a release build will help significantly. If it is already release, you may save a bit with different compile switches; however, realize that those switches may buy some percentage, but typically not factors.
Finally, you also have to consider your hardware: a slow disk or network might be the bottleneck.
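One way to check the hardware hypothesis: time a raw sequential read of the same file and compute the throughput. If that alone approaches your 99 seconds, no QScintilla-side change will help. A stdlib sketch (the function name is mine; pass the real file's path):

```cpp
#include <chrono>
#include <fstream>
#include <vector>

// Time a raw sequential read and return throughput in MB/s, to compare the
// pure I/O cost against the total load time measured in the editor.
double readThroughputMBps(const char* path) {
    std::ifstream in(path, std::ios::binary);
    std::vector<char> buf(1 << 20);               // 1 MB read buffer
    long long bytes = 0;
    const auto begin = std::chrono::steady_clock::now();
    while (in.read(buf.data(), static_cast<std::streamsize>(buf.size())) ||
           in.gcount() > 0) {
        bytes += in.gcount();
    }
    const double secs = std::chrono::duration<double>(
        std::chrono::steady_clock::now() - begin).count();
    return secs > 0 ? (bytes / 1e6) / secs : 0.0;
}
```

Dividing 1100 MB by the reported MB/s gives the floor below which no editor-side optimisation can go.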
-
So as I understand it, the following code is the best solution, as we have no incremental loading in QScintilla.
I changed the code to this now:
clock_t begin = clock();
QFile data(ofilename.toLatin1().constData());
QString line;
if (data.open(QFile::ReadOnly)) {
    setText(data.readAll());
    setModified(false);
    data.close();
    d_filename = ofilename;
    emit fileNameChanged(ofilename);
}
@Qt-Enthusiast said:
Yes, as the only alternative is calling QsciScintilla::append, and you gain nothing by adding all the lines one by one.
It is not partial loading, since in the end you will have loaded everything anyway.