All,
I'm in the process of converting my desktop application from a local SQL Server to a cloud-based SQL Server in Azure, and I'm running into performance issues with the switch.
On the local server during testing there are no performance issues. After moving the database to the cloud, though, I'm noticing issues even with just the test data (5 tables, each with around 15-25 columns and roughly 30 records).
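For reference, I'm connecting to Azure over ODBC roughly like this (the driver name, server, database, and credentials below are placeholders, not my actual values):

```cpp
#include <QSqlDatabase>
#include <QSqlError>
#include <QDebug>

bool openAzureConnection()
{
    // Azure SQL is reached through the ODBC driver using a DSN-less
    // connection string; everything below is just an example layout.
    QSqlDatabase db = QSqlDatabase::addDatabase("QODBC");
    db.setDatabaseName(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:myserver.database.windows.net,1433;"
        "Database=mydb;"
        "Encrypt=yes;");
    db.setUserName("myuser");
    db.setPassword("mypassword");

    if (!db.open()) {
        qDebug() << db.lastError().text();
        return false;
    }
    return true;
}
```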
Each table is accessed via a custom widget in a QMdiSubWindow. The custom widgets are primarily composed of the following (a rough sketch of how they're wired together follows the screenshot below):
QSqlTableModel with an OnManualSubmit edit strategy
QTableView (Green)
QDataWidgetMapper (Blue)
In certain cases, a secondary QTableView lookup based on the primary selection (Red)
[image: e0a685c2-94b9-40fb-8f1e-eef45ccd765d.png]
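A stripped-down version of the wiring in each sub-window looks roughly like this (table name, mapped column, and editor are placeholders; the real widgets have more editors and, in some cases, the secondary lookup view):

```cpp
#include <QDataWidgetMapper>
#include <QItemSelectionModel>
#include <QLineEdit>
#include <QSqlTableModel>
#include <QTableView>
#include <QVBoxLayout>
#include <QWidget>

class TableWidget : public QWidget
{
public:
    explicit TableWidget(QWidget *parent = nullptr) : QWidget(parent)
    {
        // "Live" model straight against the database, manual submit
        m_model = new QSqlTableModel(this);
        m_model->setTable("my_table");
        m_model->setEditStrategy(QSqlTableModel::OnManualSubmit);
        m_model->select();

        // Primary view (green)
        m_view = new QTableView(this);
        m_view->setModel(m_model);

        // Mapper feeding the detail editors (blue)
        m_nameEdit = new QLineEdit(this);
        m_mapper = new QDataWidgetMapper(this);
        m_mapper->setModel(m_model);
        m_mapper->addMapping(m_nameEdit, 1);

        // Keep the mapper in sync with the selected row; in some windows
        // the current row also drives the secondary lookup view (red)
        connect(m_view->selectionModel(), &QItemSelectionModel::currentRowChanged,
                m_mapper, &QDataWidgetMapper::setCurrentModelIndex);

        auto *layout = new QVBoxLayout(this);
        layout->addWidget(m_view);
        layout->addWidget(m_nameEdit);
    }

private:
    QSqlTableModel *m_model = nullptr;
    QTableView *m_view = nullptr;
    QDataWidgetMapper *m_mapper = nullptr;
    QLineEdit *m_nameEdit = nullptr;
};
```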
From researching similar posts such as this one, it appears that my issue is related to the QTableView connections and the volume of data() calls hitting the "live" model whenever the windows gain/lose focus, are resized, are moved, etc.
My questions are:
Is this an application design issue? I.e., should the database not be accessed directly in the cloud? If so, what approaches have others used for database access from multiple locations?
If cloud access is fine (albeit with some additional security concerns), how have others implemented a cached model to get around these performance issues, as suggested by @SGaist and @MrShawn in the linked post above?
My first thought here was to create a QStandardItemModel subclass that gets populated on initial load via a QSqlQuery with setForwardOnly set to true, keep track of changes (UPDATE, INSERT, DELETE, etc.) in a cache, and batch the uploads to the cloud. Is this on track?
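Something along these lines for the initial load (table name is a placeholder; the change tracking and batched writes aren't shown yet):

```cpp
#include <QStandardItemModel>
#include <QSqlQuery>
#include <QSqlRecord>

// Pull everything into an in-memory model once, so the views and
// mappers never touch the cloud connection while painting/resizing.
QStandardItemModel *loadCache(QObject *parent)
{
    QSqlQuery query;
    query.setForwardOnly(true);           // stream rows, no scrollable cursor
    query.exec("SELECT * FROM my_table");

    const int columnCount = query.record().count();
    auto *cache = new QStandardItemModel(0, columnCount, parent);

    while (query.next()) {
        QList<QStandardItem *> row;
        for (int col = 0; col < columnCount; ++col)
            row.append(new QStandardItem(query.value(col).toString()));
        cache->appendRow(row);
    }
    return cache;
}
```

The idea would then be to listen to dataChanged() / rowsInserted() / rowsRemoved() on the cache, queue the corresponding UPDATE/INSERT/DELETE statements, and push them to Azure in a single transaction when the user saves.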
My connection is 150 Mbps down / 50 Mbps up, and I'm running Qt 6.8.1 with Qt Creator 15.0.0.
Thanks,
Zach