I use QAbstractItemModel add 10,000,000 row * 10 column datas,It consume 1G memory
-
I use QAbstractItemModel to add 10,000,000 rows × 10 columns of data, and it consumes 1 GB of memory.
I want to reduce my memory usage, but I have no idea how.
@IGNB
10,000,000 * 10 == 100,000,000 values. 1 GB is ~1,000,000,000 bytes, so on average each of your values occupies 10 bytes, including whatever overhead is necessary. So what do you expect? The 1 GB consumption does not seem unreasonable for this much data. If you don't want to use that much memory, you must either:
-
Reduce the average size of each value (probably marginal savings; there are certain fixed minima for the size of each item, plus overheads), reduce the number of columns, or reduce the number of rows.
-
Put the data into external storage instead of memory. The obvious choice here might be a SQLite database, for which you can use QSqlDatabase and judicious QSqlQuery... classes/queries which do not bring too much into memory.
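As a rough sketch of that suggestion (assuming a hypothetical local SQLite file named data.db with a table named items; adjust names to your schema), opening the database and streaming rows with a forward-only QSqlQuery might look like this:

```cpp
#include <QCoreApplication>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariant>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    // Hypothetical file/table names; replace with your own.
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");
    db.setDatabaseName("data.db");
    if (!db.open()) {
        qWarning() << "Cannot open database";
        return 1;
    }

    QSqlQuery query(db);
    query.setForwardOnly(true);   // stream results instead of caching them all
    query.exec("SELECT col0, col1 FROM items");
    while (query.next()) {
        // Process one row at a time; only the current row is held in memory.
        QVariant v = query.value(0);
        // ...
    }
    return 0;
}
```

setForwardOnly(true) is the key detail: without it, many drivers cache the whole result set in the QSqlQuery.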
-
@JonB Thank you very much for your advice.
How about putting the model (which consumes almost all the memory) on a server, with the client used for display only?
Is this feasible?
@IGNB Well, you can put the data on a distant server, but the model itself will have to be in the client's memory. Whether you choose local or distant storage, you will need to fetch the data you need to display (and only that data, since you won't need to display it all at the same time). But at some point the data to display has to be in memory, and so does the model.
You just don't need your 10M rows in memory at the same time, so in short, load into memory just what has to be displayed, no more, no less.
You can use a distant database directly (SQL or any other type) or go through an API (REST or GraphQL), depending on your needs.
-
Or simply filter before you add them to a model - no one is able to handle 10M rows in a useful way.
-
@IGNB said in I use QAbstractItemModel add 10,000,000 row * 10 column datas,It consume 1G memory:
how about putting the model (which consumes almost all the memory) on a server, with the client used for display only?
Is this feasible?
Yes, this is kind of what @JonB was suggesting, with SQLite as the "server", even if it might be local.
Disclaimer: much of the following is from fairly old memory, but should help steer you in the right direction, hopefully ;)
The idea is that the model tells the view that data is available, but doesn't actually fetch it from the source (which could be a remote server, a local SQLite database, etc) unless the view asks for it.
If the model knows how many rows there are (eg a SQL query result indicated that there are 10M rows), then QAbstractItemModel::rowCount() returns the total count (eg 10M), even though very little data (possibly none at all) has been fetched yet - only whatever the view needs to render its first view. In this scenario, the view's scrollbar (if it has one) is scaled to indicate the 10M rows, so if the user drags it somewhere down towards the bottom, the view asks the model for the items at that position, and the model fetches those rows (and often a few either side) from the server. (If I remember correctly, for this to be performant you really want to set the ResizeMode of the view's vertical header to QHeaderView::Fixed.)
If the model can't tell in advance how many rows there will be in total (eg using a SQL query's scrollable cursor), then QAbstractItemModel::canFetchMore() returns true until there are no more rows available, and so if the user keeps scrolling the view, the view will keep asking the model if there's more, requesting more via QAbstractItemModel::fetchMore(), and updating (re-scaling) the vertical scrollbar (if any) as the set of rows expands.
Of course, you can implement those behaviours yourself by sub-classing QAbstractItemModel, but the QSqlQueryModel class does something along those lines for you, depending on what the database engine supports. From the docs:
If the database doesn't return the number of selected rows in a query, the model will fetch rows incrementally. See fetchMore() for more information.
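A minimal sketch of that canFetchMore()/fetchMore() behaviour, assuming a hypothetical fetchRowsFromSource() backend (here faked with generated strings; in a real model it would be a SQL query or REST call):

```cpp
#include <QAbstractTableModel>
#include <QStringList>

// Sketch only: a model that reports rows in batches, so the view pulls data
// on demand instead of everything being loaded up front.
class LazyModel : public QAbstractTableModel
{
public:
    int rowCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : int(m_rows.size()); }

    int columnCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : 10; }

    QVariant data(const QModelIndex &index, int role = Qt::DisplayRole) const override
    {
        if (!index.isValid() || role != Qt::DisplayRole)
            return QVariant();
        return m_rows.at(index.row());      // one value per row in this sketch
    }

    bool canFetchMore(const QModelIndex &parent) const override
    { return !parent.isValid() && !m_sourceExhausted; }

    void fetchMore(const QModelIndex &parent) override
    {
        if (parent.isValid() || m_sourceExhausted)
            return;
        const QStringList batch = fetchRowsFromSource(int(m_rows.size()), 256);
        if (batch.isEmpty()) { m_sourceExhausted = true; return; }
        beginInsertRows(QModelIndex(), int(m_rows.size()),
                        int(m_rows.size() + batch.size()) - 1);
        m_rows += batch;
        endInsertRows();
    }

private:
    QStringList fetchRowsFromSource(int offset, int count) const
    {
        // Hypothetical backend: replace with a real SQL/REST fetch. Here we
        // fabricate rows and stop after 10M to mimic an exhausted source.
        QStringList batch;
        const long long total = 10000000LL;
        for (int i = 0; i < count && offset + i < total; ++i)
            batch << QString::number(offset + i);
        return batch;
    }

    QStringList m_rows;
    bool m_sourceExhausted = false;
};
```

The beginInsertRows()/endInsertRows() bracketing is what lets the attached view re-scale its scrollbar as each batch arrives.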
Either way, you might need to add some code to free and invalidate some of the earlier indexes that haven't been used in a while, if memory growth is an issue for you (how many people will actually scroll through all 10M rows? How much RAM can you depend on your users having?)
Cheers.
PS I did quite a bit of benchmarking of Qt's handling of millions of rows in models/views back in the Qt 4.3 era, and was frankly really impressed. That capability alone was the reason my employer chose to use Qt for that project (and a number of projects that followed it), and the reason I learned, and fell in love with, Qt :D
-
@Paul-Colby
Everything you have written is correct/useful for the OP. However, there is one "wrinkle", which I discussed with someone else yesterday, that you/the OP/a reader should be aware of:
requesting more via QAbstractItemModel::fetchMore()
but the QSqlQueryModel class does something along those lines for you
you might need to add some code to free and invalidate some of the earlier indexes that haven't been used in a while, if memory growth is an issue for you
There is a potential problem if you pick QSqlQueryModel. Please read the post at https://forum.qt.io/topic/144535/qsqlquerymodel-incrementally-remove-old-rows/11. QSqlQueryModel does not offer a way to remove/reduce the rows previously read: you can fill it incrementally via fetchMore(), but you cannot remove rows dynamically. QSqlQuery can be used instead by the OP to implement their own model, which does allow disposing of previously read rows.
-
@JonB I have seen software developed in Delphi that used only 10-30 MB of memory to query and display a table of 100,000 × 100.
Only in the first few seconds before display did memory climb to 200-300 MB, and then it dropped after display.
Sorting, filtering and searching on that table also did not use much memory.
Can I do the same with Qt? -
@IGNB
That would imply the data is in an external database (on local disk, or on a connected server). You can doubtless achieve something like that under Qt with the QSql... classes, though as I wrote above you may have to use QSqlQuerys with your own model, so that you can control memory usage, rather than directly using a QSqlQueryModel. Sorting/filtering/searching can be done most efficiently by passing a SQL SELECT statement with suitable parameters down to the database level, rather than reading all the records in and doing it on an in-memory table yourself.
-
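For instance (a sketch only, assuming a hypothetical items table with name and col0 columns), the filter, the sort order, and even the paging can all be pushed into the SELECT itself, so the database does the work and only one window of rows ever reaches the client:

```cpp
#include <QSqlQuery>
#include <QString>
#include <QStringList>
#include <QVariant>

// Sketch: let the database filter, sort and page, instead of loading
// everything into an in-memory model. Table/column names are hypothetical.
QStringList fetchPage(const QString &pattern, int limit, int offset)
{
    QSqlQuery query;
    query.prepare("SELECT col0 FROM items "
                  "WHERE name LIKE :pattern "
                  "ORDER BY col0 "
                  "LIMIT :limit OFFSET :offset");
    query.bindValue(":pattern", pattern);   // eg "%abc%"
    query.bindValue(":limit", limit);       // one screenful (plus some slack)
    query.bindValue(":offset", offset);     // advance as the user scrolls
    query.exec();

    QStringList page;
    while (query.next())
        page << query.value(0).toString();  // only this window reaches the client
    return page;
}
```

This is presumably what the Delphi application was doing: the database engine keeps its own caches and indexes, and the client only ever holds a few hundred rows.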