Model View Design challenge with larger dataset
-
My PC is a potato, and VirtualBox running Linux under Windows adds more potato-ness. It seems the Linux VM does use my NVidia card, but it barely registers any usage when scrolling. I do not know how your in-memory model could have been written, or could have performed, much differently than my example.
I am about to give it a go with a DB backend. I can barely remember how I set up MySQL on the Linux box, so I may try SQLite to save the brain ache....
-
@swankster said in Model View Design challenge with larger dataset:
Subclassing QAbstractTableModel into my custom myModel then
Before we go any further: Why have you chosen QAbstractTableModel instead of QSqlTableModel/Query? Ah, is that because you query the SQL, perhaps with one of these, and then do your "transformation" to CSV/QFile?
Whenever it is you query the database, do you call void QSqlQuery::setForwardOnly(bool forward)? From my experience forward-only is terribly significant for performance on SQL/MySQL. If you have not done that you should try it. I still do not think you should need to do any of the CSV serialization, use of QFile or QContiguousCache.
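For illustration, a minimal sketch of where that setForwardOnly() call would go, assuming you run your own SELECT somewhere (the connection variable and query text below are only placeholders):

// "db" stands for whatever open QSqlDatabase connection you already have.
QSqlQuery query(db);
query.setForwardOnly(true);                    // must be called before exec() to have any effect
if (!query.exec("SELECT * FROM your_table"))   // placeholder query
    qDebug() << query.lastError().text();
while (query.next()) {
    // read query.value(0), query.value(1), ... and feed whatever you build from them
}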
-
@JonB yes I do query the SQL.
@JonB said in Model View Design challenge with larger dataset:
perhaps with one of these, and then do your "transformation" to CSV/QFile?
I don't understand what you are asking here.
No, I have not used void QSqlQuery::setForwardOnly(bool forward).
I will try applying this, thanks for the suggestion.
-
@swankster said in Model View Design challenge with larger dataset:
I don't understand what you are asking here.
Normally you use QSqlQuery or similar when your model is a SQL database, and that is what you set QTableView's model to. However, you say you are binding to a QAbstractTableModel. So I don't know why, or where, you do your SQL query. If I understand right, you copy the SQL results to a QFile (perhaps as CSV format). I can only guess that is what you bind the table view to, and hence that is your own QAbstractTableModel rather than a QSql... one. Is that right?
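To illustrate what I mean by the "normal" arrangement, it is roughly the following (the table name is obviously made up, and "w" stands for the QTableView):

// The SQL model itself is what the view displays; no intermediate file or cache.
QSqlQueryModel *model = new QSqlQueryModel(&w);
model->setQuery("SELECT * FROM your_table", db);   // placeholder query on an open connection "db"
if (model->lastError().isValid())
    qDebug() << model->lastError().text();
w.setModel(model);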
-
@swankster, @ankou29666
I smacked together the following to test with a database backend:

#include <QApplication>
#include <QTableView>
#include <QFile>
#include <QSqlDatabase>
#include <QSqlTableModel>
#include <QSqlQuery>
#include <QSqlError>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    QTableView w;

    QFile::remove("sqlitedb.db");
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");
    db.setHostName("localhost");
    db.setDatabaseName("sqlitedb.db");
    bool ok = db.open();
    Q_ASSERT(ok);

    QSqlTableModel myModel(nullptr, db);

    QSqlQuery q;
    ok = q.prepare("CREATE TABLE t1(row INT, col0 INT, col1 INT, col2 INT, col3 INT, col4 INT, col5 INT, col6 INT, col7 INT, col8 INT, col9 INT)");
    if (!ok)
        qDebug() << q.lastError().driverText() << q.lastError().databaseText();
    ok = q.exec();
    if (!ok)
        qDebug() << q.lastError().driverText() << q.lastError().databaseText();

    for (int row = 0; row < 56000; row++)
    {
        ok = q.prepare("INSERT INTO t1 VALUES(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)");
        if (!ok)
            qDebug() << q.lastError().driverText() << q.lastError().databaseText();
        q.bindValue(0, row);
        for (int col = 1; col <= 10; col++)
            q.bindValue(col, col);
        ok = q.exec();
        if (!ok)
            qDebug() << q.lastError().driverText() << q.lastError().databaseText();
    }

    myModel.setTable("t1");
    myModel.select();
    qDebug() << myModel.rowCount();

    w.setModel(&myModel);
    w.show();
    return a.exec();
}
I admit I am using SQLite rather than MySQL, because I can't be bothered to figure out my MySQL. It takes quite a while to insert all the rows, but thereafter the scrolling is regular, fast and smooth. (About 1,000 rows every 2 seconds with my finger on the Page Down key on the table view, which I suspect is down to the key auto-repeat rate rather than fetching/displaying rows, and certainly fast enough for the user.)
Note that the initial qDebug() << myModel.rowCount(); does indeed print 256, as I suggested would be the case. The Qt SQL code fetches "buffers" of 256 rows at a time. If/when you want more you have to call fetchMore(). The QTableView is doing this internally in response to the scrolling, whenever it reaches the end of the latest buffer. There might be a tiny pause each time, though too fast to tell. And once fully populated I can scroll up & down freely.
YMMV. MySQL will certainly involve cross-process access, compared against SQLite's direct file access. And if your MySQL database/server is on another machine that will be a further overhead.
Why don't you start by seeing how this performs for you?
The only thing I can think of which could cause a "delay/lag" is when the view needs to request the next 256 rows from the db. This would occur during table view scrolling. But it should not become progressively worse as more records have been read. You can do all the fetchMore() calls yourself in advance (while (canFetchMore()) fetchMore();) to fully populate the model before displaying the view. From that you should see how much delay they cause on their own, and you can see the view's behaviour when there is no database access going on (a two-line sketch follows at the end of this post).
Over to you, @swankster! I really don't know exactly what you are doing or what your situation is. My own finding/belief is that it should work as shown in your situation without excessive lag and without all your pre-processing shenanigans.
-
@JonB thanks Jon, QSqlTableModel does work smoothly, but I forgot to mention that I need to do more than just display the table data.
I have one field that indicates severity (1-4). If its value is, say, 4, the items within the record are displayed in red. That works really smoothly using QContiguousCache.
-
@swankster that's a job for either a proxy model or a QStyledItemDelegate.
-
I was rather thinking about QIdentityProxyModel. Way more lightweight than QSortFilterProxyModel.
I would test both the identity model and item delegate to see which one is faster.
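As a rough sketch of the identity-proxy idea (the severity column number is a placeholder, and the "value 4 shown in red" rule is taken from the post above):

// Colours rows via Qt::ForegroundRole without copying or transforming any data.
class SeverityColorProxy : public QIdentityProxyModel
{
public:
    using QIdentityProxyModel::QIdentityProxyModel;

    static constexpr int SEVERITY_COL = 3;   // placeholder: whichever column holds severity

    QVariant data(const QModelIndex &index, int role = Qt::DisplayRole) const override
    {
        if (role == Qt::ForegroundRole) {
            const int severity = QIdentityProxyModel::data(
                        index.sibling(index.row(), SEVERITY_COL), Qt::DisplayRole).toInt();
            if (severity == 4)
                return QColor(Qt::red);
        }
        return QIdentityProxyModel::data(index, role);
    }
};

// Usage: proxy->setSourceModel(&sqlModel); view.setModel(proxy);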
-
@SGaist I have not come across QIdentityProxyModel yet, I will try that. Thanks.
Just a snippet of what I am doing, which works by subclassing QAbstractTableModel.
So the row color depends on 2 columns: its origin and its severity.

QColor EventsModelByDate::eventSeverityTextColor( int severity, QString origin ) const
{
    switch ( severity )
    {
    case 4: /// Info
        if ( origin == "FRM" || origin == "HET" || origin == "IRS" || origin == "SCS" || origin == "PLG" )
            return QColor( 0, 40, 120 );//, Qt::ForegroundRole );
        else if ( origin == "TRM" || origin == "TRS" )
            return QColor( 80, 136, 180 );
        else if ( origin == "VPP" || origin == "DPP" || origin == "PTP" )
            return QColor( 80, 136, 180 );
        else
            return QColor( 0, 0, 0 );
        break;
    case 8: /// Warning
        return QColor( 145, 70, 20 );
        break;
    case 16: /// Error
        return QColor( 245, 0, 0 );
        break;
    default: /// Info
        return QColor( 0, 0, 0 );
        break;
    }
}
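In case it helps anyone reading along, that helper is presumably called from the model's data() for Qt::ForegroundRole, along these lines (the column positions below are invented, not taken from the real model):

// Guesswork at how the helper above plugs in; SEVERITY_COL / ORIGIN_COL are placeholders.
static constexpr int SEVERITY_COL = 2;
static constexpr int ORIGIN_COL   = 3;

QVariant EventsModelByDate::data( const QModelIndex &index, int role ) const
{
    if ( role == Qt::ForegroundRole )
    {
        const int severity   = index.sibling( index.row(), SEVERITY_COL ).data( Qt::DisplayRole ).toInt();
        const QString origin = index.sibling( index.row(), ORIGIN_COL ).data( Qt::DisplayRole ).toString();
        return eventSeverityTextColor( severity, origin );
    }
    // ... existing handling of Qt::DisplayRole and the other roles remains here ...
    return QVariant();
}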
-
@SGaist thanks,
it appears to be coming together using QIdentityProxyModel. The initial load of a small dataset is instantaneous, but the larger dataset takes 2-3 seconds to load. I would like to see that lower, but it might be acceptable to our main critic.
-
I would try the item delegate as well. Only applied to the concerned column. It should only affect the visible rows.
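A minimal sketch of that (the column number is a placeholder, and the "4 means red" rule is taken from the earlier post):

// A delegate that only tints the text of the cells it is installed on.
class SeverityDelegate : public QStyledItemDelegate
{
public:
    using QStyledItemDelegate::QStyledItemDelegate;

protected:
    void initStyleOption(QStyleOptionViewItem *option, const QModelIndex &index) const override
    {
        QStyledItemDelegate::initStyleOption(option, index);
        if (index.data(Qt::DisplayRole).toInt() == 4)
            option->palette.setColor(QPalette::Text, Qt::red);
    }
};

// Installed on just the severity column:
// view.setItemDelegateForColumn(SEVERITY_COL, new SeverityDelegate(&view));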
-
@SGaist applying the delegate did not make any difference to the opening response, but turning off resizeColumnsToContents made a substantial difference. Now it is basically doing everything I need it to. Now on to the fine tuning and clean up. Thanks to you all for your input, greatly appreciated.
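For anyone landing here later, a cheap replacement for resizeColumnsToContents() (if fixed or user-adjustable widths are acceptable) can be as simple as the following; "view" stands for the QTableView and the width value is a placeholder:

// Give the columns a width once instead of asking the model to size them.
view.horizontalHeader()->setSectionResizeMode(QHeaderView::Interactive);
for (int col = 0; col < view.model()->columnCount(); ++col)
    view.setColumnWidth(col, 120);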
-
Oh right, I forgot about that one. Depending on the content you can apply the same to rows.
In any case, since your issue is fixed, please mark the thread as solved using the "Topic Tools" button so that other forum members may know a solution has been found :-)