I use QAbstractItemModel to add 10,000,000 rows * 10 columns of data, and it consumes 1 GB of memory

IGNB wrote (#1):

I use QAbstractItemModel to add 10,000,000 rows * 10 columns of data, and it consumes 1 GB of memory.
I want to reduce the memory usage, but I have no idea how.

JonB wrote (#2):

      @IGNB
10,000,000 * 10 == 100,000,000 values, and 1 GB is ~1,000,000,000 bytes, so on average each of your values occupies about 10 bytes, including whatever overhead is necessary. So what do you expect? The 1 GB consumption does not seem unreasonable for this much data.

      If you don't want to use that much memory, you must either

• Reduce the average size of each value (probably marginal savings; there are certain fixed minima for the size of each item plus overheads), reduce the number of columns, or reduce the number of rows.

• Put the data into external storage instead of memory. The obvious choice here might be a SQLite database, which you can access through QSqlDatabase and judicious QSqlQuery... classes/queries that do not bring too much into memory at once; a minimal sketch follows below.
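Something along these lines, assuming an SQLite file named data.db and a made-up table "measurements" with ten columns; only the row count is asked for up front, so nothing bulky is pulled into RAM:

```cpp
// Minimal sketch: keep the 10M rows in an on-disk SQLite file and only ask
// the database for what is actually needed. File name, table name and column
// names are placeholders.
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariant>

bool openStore()
{
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");
    db.setDatabaseName("data.db");       // the rows live on disk, not in RAM
    return db.open();
}

int totalRowCount()
{
    // Cheap metadata query; no row data is loaded into memory.
    QSqlQuery query("SELECT COUNT(*) FROM measurements");
    return query.next() ? query.value(0).toInt() : 0;
}
```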


IGNB wrote (#3):

@JonB Thank you very much for your advice.
How about putting the model (which consumes almost all of the memory) on a server, with the client used for display only?
Is this feasible?


A Former User wrote (#4):

@IGNB Well, you can put the data on a remote server, but the model itself will have to live in the client's memory. Whether you choose local or remote storage, you will need to fetch the data you want to display (and only that data, since you won't need to display it all at the same time). But at some point, the data to display has to be in memory, and so does the model.

You just don't need your 10M rows in memory at the same time. In short, load into memory only what has to be displayed, no more, no less.

You can query a remote database directly (SQL or any other type) or go through an API (REST or GraphQL), depending on your needs.
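To make the "fetch only what is displayed" idea concrete on the API side, here is a rough sketch of requesting a single page of rows over HTTP; the URL, the offset/limit query parameters and the JSON layout are all invented for illustration:

```cpp
// Sketch: ask a (hypothetical) REST endpoint for one page of rows; the other
// millions of rows never leave the server.
#include <QJsonArray>
#include <QJsonDocument>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QObject>
#include <QString>
#include <QUrl>
#include <QUrlQuery>

void requestPage(QNetworkAccessManager *nam, int offset, int limit)
{
    QUrl url("https://example.com/api/rows");   // placeholder endpoint
    QUrlQuery params;
    params.addQueryItem("offset", QString::number(offset));
    params.addQueryItem("limit", QString::number(limit));
    url.setQuery(params);

    QNetworkReply *reply = nam->get(QNetworkRequest(url));
    QObject::connect(reply, &QNetworkReply::finished, reply, [reply]() {
        // Assumed response shape: a JSON array of rows.
        const QJsonArray rows = QJsonDocument::fromJson(reply->readAll()).array();
        // Hand these (at most `limit`) rows to the model for display.
        Q_UNUSED(rows);
        reply->deleteLater();
    });
}
```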


Christian Ehrlicher (Lifetime Qt Champion) wrote (#5):

Or simply filter the data before you add it to the model - no one can handle 10M rows in a useful way anyway.
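As a small illustration of that point (ideally the filter would already be applied at the data source, e.g. in the SQL query, so unwanted rows never reach the client at all), here is a sketch that only turns matching rows into model items; the row container and the filter predicate are made up:

```cpp
// Sketch: only rows that pass the filter are ever added to the model, so the
// model stays small even if the raw data set is huge.
#include <QStandardItemModel>
#include <QString>
#include <QStringList>
#include <QVector>

void populateFiltered(QStandardItemModel *model,
                      const QVector<QStringList> &rawRows,   // placeholder source
                      const QString &needle)                 // placeholder filter
{
    for (const QStringList &raw : rawRows) {
        if (!raw.join(' ').contains(needle, Qt::CaseInsensitive))
            continue;                                        // skip non-matching rows
        QList<QStandardItem *> items;
        for (const QString &cell : raw)
            items.append(new QStandardItem(cell));
        model->appendRow(items);                             // model owns the items
    }
}
```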

Paul Colby wrote (#6):

@IGNB said in I use QAbstractItemModel to add 10,000,000 rows * 10 columns of data, and it consumes 1 GB of memory:

How about putting the model (which consumes almost all of the memory) on a server, with the client used for display only?
Is this feasible?

Yes, this is kind of what @JonB was suggesting, with SQLite as the "server", even if it might be local.

              Disclaimer: much of the following is from fairly old memory, but should help steer you in the right direction, hopefully ;)

The idea is that the model tells the view that data is available, but doesn't actually fetch it from the source (which could be a remote server, a local SQLite database, etc.) unless the view asks for it.

If the model knows how many rows there are (eg a SQL query result indicated that there are 10M rows), then QAbstractItemModel::rowCount() returns the total count (eg 10M), even though very little data (possibly none at all) has been fetched yet - only whatever the view needs to render its first screenful. In this scenario, the view's scrollbar (if it has one) is scaled to reflect the 10M rows, so if the user drags it down to somewhere near the bottom, the view asks the model for the items at that position, and the model fetches those rows (and often a few either side) from the server. (If I remember correctly, for this to be performant you really want to set the ResizeMode of the view's vertical header to QHeaderView::Fixed.)
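A rough sketch of that first scenario, assuming the total row count is already known and with fetchBlockFromSource() standing in for whatever really reads the rows (a SQL SELECT with LIMIT/OFFSET, a paged REST call, ...):

```cpp
// Sketch: the model reports the full row count immediately, but only fetches
// (and caches) small blocks of rows on demand when data() is asked for them.
#include <QAbstractTableModel>
#include <QHash>
#include <QString>
#include <QVariant>
#include <QVector>

class LazyTableModel : public QAbstractTableModel
{
public:
    explicit LazyTableModel(int totalRows, QObject *parent = nullptr)
        : QAbstractTableModel(parent), m_totalRows(totalRows) {}

    int rowCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : m_totalRows; }            // e.g. 10,000,000

    int columnCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : 10; }

    QVariant data(const QModelIndex &index, int role = Qt::DisplayRole) const override
    {
        if (!index.isValid() || role != Qt::DisplayRole)
            return QVariant();
        const int block = index.row() / BlockSize;
        if (!m_cache.contains(block))                         // fetch the block lazily
            m_cache.insert(block, fetchBlockFromSource(block * BlockSize, BlockSize));
        return m_cache[block].value(index.row() % BlockSize).value(index.column());
    }

private:
    static constexpr int BlockSize = 256;

    // Placeholder: in real code this would run e.g. "SELECT ... LIMIT :count
    // OFFSET :offset" or a paged REST request.
    static QVector<QVector<QVariant>> fetchBlockFromSource(int offset, int count)
    {
        QVector<QVector<QVariant>> rows(count, QVector<QVariant>(10));
        for (int r = 0; r < count; ++r)
            for (int c = 0; c < 10; ++c)
                rows[r][c] = QString("row %1 col %2").arg(offset + r).arg(c);
        return rows;
    }

    int m_totalRows;
    mutable QHash<int, QVector<QVector<QVariant>>> m_cache;   // block index -> rows
};
```

On the view side, the ResizeMode suggestion above translates to something like tableView->verticalHeader()->setSectionResizeMode(QHeaderView::Fixed);.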

If the model can't tell in advance how many rows there will be in total (eg when using a SQL query's scrollable cursor), then QAbstractItemModel::canFetchMore() returns true until there are no more rows available, so if the user keeps scrolling the view, the view will keep asking the model whether there's more, requesting more via QAbstractItemModel::fetchMore(), and updating (re-scaling) the vertical scrollbar (if any) as the set of rows expands.
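And a rough sketch of that second scenario, where the total is unknown and the model grows as the view scrolls; readNextChunk() here is only a stand-in for the real source (for example stepping through a forward-only QSqlQuery):

```cpp
// Sketch: canFetchMore()/fetchMore() grow the model in chunks; the view keeps
// requesting more as the user scrolls until the source runs dry.
#include <QAbstractTableModel>
#include <QString>
#include <QVariant>
#include <QVector>

class IncrementalModel : public QAbstractTableModel
{
public:
    using QAbstractTableModel::QAbstractTableModel;

    int rowCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : m_rows.size(); }

    int columnCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : 10; }

    QVariant data(const QModelIndex &index, int role = Qt::DisplayRole) const override
    {
        if (!index.isValid() || role != Qt::DisplayRole)
            return QVariant();
        return m_rows.at(index.row()).value(index.column());
    }

    bool canFetchMore(const QModelIndex &parent) const override
    { return !parent.isValid() && !m_exhausted; }

    void fetchMore(const QModelIndex &parent) override
    {
        if (parent.isValid() || m_exhausted)
            return;
        const QVector<QVector<QVariant>> chunk = readNextChunk(ChunkSize);
        if (chunk.isEmpty()) {                        // source has no more rows
            m_exhausted = true;
            return;
        }
        const int first = m_rows.size();
        beginInsertRows(QModelIndex(), first, first + chunk.size() - 1);
        m_rows += chunk;
        endInsertRows();
    }

private:
    static constexpr int ChunkSize = 256;

    // Placeholder source: pretends to run out after 1,000 rows. In real code
    // this would step a QSqlQuery cursor, read a file, call an API, etc.
    QVector<QVector<QVariant>> readNextChunk(int maxRows)
    {
        QVector<QVector<QVariant>> chunk;
        while (chunk.size() < maxRows && m_produced < 1000) {
            QVector<QVariant> row(10);
            for (int c = 0; c < 10; ++c)
                row[c] = QString("row %1 col %2").arg(m_produced).arg(c);
            chunk.append(row);
            ++m_produced;
        }
        return chunk;
    }

    QVector<QVector<QVariant>> m_rows;
    bool m_exhausted = false;
    int m_produced = 0;
};
```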

              Of course you can implement those behaviours yourself by sub-classing QAbstractItemModel, but the QSqlQueryModel class does something along those lines for you, depending on what the database engine supports. From the docs:

              If the database doesn't return the number of selected rows in a query, the model will fetch rows incrementally. See fetchMore() for more information.

              Either way, you might need to add some code to free and invalidate some of the earlier indexes that haven't been used in a while, if memory growth is an issue for you (how many people will actually scroll through all 10M rows? How much RAM can you depend on your users having?)

              Cheers.

PS I did quite a bit of benchmarking of Qt's handling of millions of rows in models/views back in the Qt 4.3 era, and was frankly really impressed. That capability alone was the reason my employer chose to use Qt for that project (and a number of projects that followed it), and the reason I learned, and fell in love with, Qt :D


JonB wrote (#7):

                @Paul-Colby
Everything you have written is correct/useful for the OP. However, there is one "wrinkle", which I discussed with someone else yesterday, that you/the OP/a reader should be aware of:

requesting more via QAbstractItemModel::fetchMore()
                but the QSqlQueryModel class does something along those lines for you
                you might need to add some code to free and invalidate some of the earlier indexes that haven't been used in a while, if memory growth is an issue for you

There is a potential problem if you pick QSqlQueryModel. Please read the post at https://forum.qt.io/topic/144535/qsqlquerymodel-incrementally-remove-old-rows/11

QSqlQueryModel does not offer a way of removing/reducing the rows previously read: you can fill it incrementally via fetchMore(), but you cannot remove rows dynamically.
QSqlQuery can be used instead by the OP to implement their own model, which does allow disposing of previously read rows.
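To give a flavour of that "dispose of previously read rows" idea without the index-shuffling problems of structurally removing rows, one option is to evict cached blocks rather than model rows: the model keeps reporting the full row count, but blocks far from the area currently being viewed are dropped and simply re-fetched from the database if they are needed again. A sketch, reusing the block-cache shape from the LazyTableModel sketch earlier in the thread (the limit of 16 blocks is arbitrary):

```cpp
// Sketch: bound the block cache so memory stays flat however far the user
// scrolls; evicted blocks are re-read from the database on demand.
#include <QHash>
#include <QVariant>
#include <QVector>
#include <QtGlobal>   // qAbs

void evictFarBlocks(QHash<int, QVector<QVector<QVariant>>> &cache,
                    int currentBlock, int maxBlocksKept = 16)
{
    if (cache.size() <= maxBlocksKept)
        return;
    for (auto it = cache.begin(); it != cache.end(); ) {
        if (qAbs(it.key() - currentBlock) > maxBlocksKept / 2)
            it = cache.erase(it);      // drop blocks far from the visible rows
        else
            ++it;
    }
}
```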


IGNB wrote (#8):

@JonB I have seen software developed in Delphi that used only 10-30 MB of memory to query and display a table of 100,000 * 100.
Only in the first few seconds before display did the memory climb to 200-300 MB, and then it dropped once the table was shown.
Sorting, filtering and searching on that table did not use much memory either.
Can I do the same with Qt?


JonB wrote (#9):

                    @IGNB
That would imply the data is in an external database (on local disk, or on a connected server). You can doubtless achieve something like that under Qt with the QSql... classes, though as I wrote above you may have to use QSqlQuerys with your own model, so that you can control memory usage, rather than directly using a QSqlQueryModel. Sorting/filtering/searching can be done most efficiently by passing a SQL SELECT statement with suitable parameters to the database, rather than reading all the records in and doing it on an in-memory table yourself.
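For example (table and column names invented), the filter, sort and paging can all be expressed in the statement itself, so only one page of matching rows is ever transferred to the client:

```cpp
// Sketch: let the database do the filtering, sorting and paging; the client
// only ever receives `limit` rows.
#include <QSqlQuery>
#include <QString>
#include <QVariant>

QSqlQuery searchPage(const QString &needle, int offset, int limit)
{
    QSqlQuery query;                                  // uses the default connection
    query.setForwardOnly(true);
    query.prepare("SELECT c0, c1, c2, c3, c4, c5, c6, c7, c8, c9 "
                  "FROM measurements "
                  "WHERE c0 LIKE :pattern "           // filter in the database
                  "ORDER BY c1 "                      // sort in the database
                  "LIMIT :limit OFFSET :offset");     // page in the database
    query.bindValue(":pattern", QString("%%1%").arg(needle));
    query.bindValue(":limit", limit);
    query.bindValue(":offset", offset);
    query.exec();
    return query;                                     // caller iterates with next()
}
```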


IGNB wrote (#10):

@JonB I much appreciate your directions. I'll give it a try.
