Database table transfer over TCP
-
@mjsurette
Thanks for the suggestion of using INSERT with a SELECT. I'll need to read more about this command strategy. It's a new method for me.
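(For context, an INSERT fed by a SELECT could be issued through QSqlQuery roughly like the sketch below; the table and column names are made up for illustration and this is not the exact command @mjsurette proposed.)

#include <QSqlDatabase>
#include <QSqlQuery>
#include <QSqlError>
#include <QDebug>

// Rough sketch (untested): copy selected rows into another table with
// INSERT ... SELECT. "archiveData" and "sensorData" are hypothetical names.
void copySelectedRows(QSqlDatabase &db)
{
    QSqlQuery query(db);
    const QString sql =
        "INSERT INTO archiveData (id, timestamp, value) "
        "SELECT id, timestamp, value FROM sensorData "
        "WHERE timestamp >= '2016-01-01'";

    if (!query.exec(sql))
        qDebug() << "INSERT ... SELECT failed:" << query.lastError().text();
}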
@kshegunov
There's no issue. As you pointed out, my method would be:

//Create the output stream
QDataStream out(&buffer);
//Create the database query object
QSqlQuery dbQuery(dbData);
//Execute the SQL command to retrieve the targeted data rows
dbQuery.exec(selStr);
//Loop through all rows/columns to extract the collected data to the output stream
while (dbQuery.next())
{
    for (int i = 0; i < dbQuery.record().count(); ++i)
        out << dbQuery.value(i);
}
I was hoping more for a "plug'n play" approach, instead of looping through all the columns and rows myself (i.e. out << dbQuery;).
@VRonin
Thank you very much for sharing your method for serializing Qt models. I'll look more into this idea.
-
@DoughBoy said in Database table transfer over TCP:
I was hoping more for a "plug'n play" approach, instead of looping through all the columns and rows myself (i.e. out << dbQuery;).
You can do that yourself:
while(dbQuery.next()) { out << dbQuery; }
Where you define your own operator <<:

QDataStream & operator << (QDataStream & out, const QSqlQuery & query)
{
    if (!query.isValid() || !query.isActive())
        return out; //< Handle the error accordingly.

    QSqlRecord record = query.record();
    for (qint32 i = 0, count = record.count(); i < count; i++)
        out << record.value(i); //< You can even write the field names if you so desire.

    return out;
}
-
@kshegunov did you miss a loop?
QDataStream & operator << (QDataStream & out, QSqlQuery & query) //< next() modifies the query, so it can't be passed as const
{
    if (!query.isValid() || !query.isActive())
        return out; //< Handle the error accordingly.

    while (query.next())
    {
        const QSqlRecord record = query.record();
        for (qint32 i = 0, count = record.count(); i < count; i++)
            out << record.value(i); //< You can even write the field names if you so desire.
    }
    return out;
}
You probably need to serialise count too, otherwise you wouldn't know how many columns there are.
-
@VRonin said in Database table transfer over TCP:
did you miss a loop?
Nope! See two lines above:
while(dbQuery.next())
:)
-
@VRonin said in Database table transfer over TCP:
I need a coffee ;)
@DoughBoy Sorry for the offtopic. I couldn't resist. ;)
-
@kshegunov
It's perfectly okay to get off topic. If you're not having fun, then you gotta do something else!

And yes, you are correct, I could define my own operator << and operator >>. But as @VRonin pointed out, I would need to serialize the column count and row count to properly know how the data is getting disassembled, in order to reassemble the SQL data structure. I didn't make that point clear in my previous post. I was hoping to maintain that unknown structure with an actual bit-by-bit serialization of the QSqlQuery stored in memory. I actually believe it would be best to work with the QSqlQueryModel in that instance (maybe I'm wrong?).

For simplicity, it seems @mjsurette has offered the approach I'm looking for: exporting the data set to a database file and then sending that file (bit-by-bit) to the remote platform, where I can then handle the SQL data collection in that manner. I could be mistaken, but I believe this approach would be much faster than disassembling the SQL data and then reassembling it. Would you agree?
-
Like you've mentioned, my approach has the advantage of simplicity. It also keeps the data structure in the transferred file. Personally I'm a big believer in keeping things simple. It makes things more reliable, as in harder to mess up, and easier to troubleshoot if they go wrong.
The SQLite file format compresses fairly well. The Chinook_Sqlite.sqlite test database compresses from 1,067,008 bytes to 343,098 bytes using the default compression on my system.
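(For illustration, the compression step could be done in Qt with qCompress before the file is sent; the file name below is hypothetical and qUncompress would restore the bytes on the receiving side.)

#include <QFile>
#include <QByteArray>

// Rough sketch: read the exported SQLite file and compress it with Qt's
// built-in zlib wrapper before writing it to the socket.
QByteArray compressDatabaseFile(const QString &path) // e.g. "out.sqlite" (hypothetical)
{
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return QByteArray();

    // Default compression level; qUncompress() reverses this on the receiver.
    return qCompress(file.readAll());
}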
As far as the speed of the transfer (or any other parameter) goes, modern systems are pretty fast and I would think that I/O would be the bottleneck. That loop might not be as bad as you think. So, I'd hate to assign any advantage to any method without first trying it.
Mike
-
@DoughBoy said in Database table transfer over TCP:
But as @VRonin pointed out I would need to serialize column count and row count to properly know how the data is getting disassembled - in order to reassemble the SQL data structure.
Yes you need the number of columns, their names, number of rows and the table name for each dataset you send (the set of rows). A package could look something like this (pseudo binary):
[TableName: QString]
[ColumnsNumber: qint8]
ColumnsNumber x [ColumnName: QString]
[RowsNumber: qint32]
RowsNumber x ColumnsNumber x [FieldValue: QVariant]
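A rough sketch of writing such a package with QDataStream could look like the following; the helper name writeResultSet and the buffering of rows are illustrative choices, not a prescribed implementation:

#include <QDataStream>
#include <QSqlQuery>
#include <QSqlRecord>
#include <QVariant>
#include <QList>

// Rough sketch (hypothetical helper): serialize one result set following the
// layout above: table name, column count, column names, row count, field values.
void writeResultSet(QDataStream &out, QSqlQuery &query, const QString &tableName)
{
    const QSqlRecord header = query.record();
    const qint8 columns = static_cast<qint8>(header.count());

    out << tableName << columns;
    for (qint8 i = 0; i < columns; ++i)
        out << header.fieldName(i);

    // The row count isn't known up front with a forward-only query, so this
    // sketch buffers the rows first and writes the count before the values.
    QList<QVariant> values;
    qint32 rows = 0;
    while (query.next()) {
        for (qint8 i = 0; i < columns; ++i)
            values << query.value(i);
        ++rows;
    }

    out << rows;
    for (const QVariant &value : values)
        out << value;
}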
I was hoping to maintain that unknown structure with an actual bit-by-bit serialization of the QSqlQuery stored in memory.
Not possible. At least not to my knowledge.
I actually believe it would be best to work with the QSqlQueryModel in that instance (maybe I'm wrong?).
I think my way is best, but I'm biased. In any case, I advise using whatever you find easiest/most convenient to implement, which ultimately means easiest to maintain!
I believe this approach would be much faster than disassembling the SQL data then reassembling it
Probably not. Even with a memory-based SQLite table the overhead would be higher, but again it depends on how much higher ... if it's 10% I say the hell with it, if it's 100% I say think twice. The problem is you can't really tell before implementing both approaches and benchmarking them ...
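If it helps, timing the two could be as simple as wrapping each run in QElapsedTimer; sendSerializedRows() and sendDatabaseFile() below are hypothetical stand-ins for the two implementations:

#include <QElapsedTimer>
#include <QDebug>

// Hypothetical stand-ins for the two approaches being compared.
void sendSerializedRows();  // QDataStream-based serialization
void sendDatabaseFile();    // export-and-ship-the-file approach

// Rough sketch: measure each approach in milliseconds and print the result.
void benchmarkBothApproaches()
{
    QElapsedTimer timer;

    timer.start();
    sendSerializedRows();
    qDebug() << "Serialized rows:" << timer.elapsed() << "ms";

    timer.restart();
    sendDatabaseFile();
    qDebug() << "Database file:" << timer.elapsed() << "ms";
}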
-
Hello everyone!
I've tried to implement both methods, but I've run into a slight snag.

@kshegunov
Your method is straightforward! With a QDataStream writing to a QByteArray, I first store the column headers (as QString) and all data values (as QVariant). Then into a parenting QByteArray I keep track of the database name, table name, number of rows, number of columns, and the data collection's QByteArray. It is this parenting QByteArray that I send to the remote platform for data extraction and storage. Working with 20 days of data, I was able to collect, transmit, and store the data in just about 4 minutes (not that this matters, but it's a way for me to compare the performance difference).

@mjsurette
I'm having a difficult time getting your strategy implemented. I'm able to populate a QSqlTableModel with my desired SQL query. I'm using the table model because it is said that this object can easily read/write to a database table. Sadly though, how do I then take the table's data and switch to a new QSqlDatabase object (this is defined in the QSqlTableModel's constructor)? The way we discussed this, I would have 2 different QSqlDatabase objects: 1) is the original data; 2) is the exported data. Could you offer some more guidance on how I can move data from one database to another?

Thanks and I look forward to hearing from you.
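(As an aside, if two QSqlDatabase objects are needed they are normally kept apart by connection name; a rough sketch, with made-up file and connection names:)

#include <QSqlDatabase>

// Rough sketch: open the source and destination databases as two independent,
// named connections; they can later be retrieved with QSqlDatabase::database().
void openBothDatabases()
{
    QSqlDatabase src = QSqlDatabase::addDatabase("QSQLITE", "source");
    src.setDatabaseName("data.sqlite"); // hypothetical original database
    src.open();

    QSqlDatabase dst = QSqlDatabase::addDatabase("QSQLITE", "export");
    dst.setDatabaseName("out.sqlite");  // hypothetical export database
    dst.open();
}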
-
@DoughBoy
The following SQL, with the Chinook test database open, will create a new file in the same directory named 'out.sqlite' holding all of the album information for artist 51, with the original table layout.

A Qt model is totally unnecessary. Just run the query and it will create your file. From there you can compress it and ship it.
ATTACH DATABASE 'out.sqlite' AS outdb;
CREATE TABLE outdb.album AS SELECT * from album WHERE ArtistId = 51;
DETACH DATABASE outdb;
It does use SQLite-specific SQL.
Note especially that the ATTACH DATABASE and DETACH DATABASE are SQL statements to be sent to the database, just like the CREATE TABLE.
BTW, I have never done this using Qt, only in batch files, but I see no reason for it not to work both ways.
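A rough sketch of what that could look like through Qt, assuming the Chinook database is already open on the connection db (untested, as noted above; one statement per exec() call):

#include <QSqlDatabase>
#include <QSqlQuery>
#include <QSqlError>
#include <QStringList>
#include <QDebug>

// Rough sketch: send the ATTACH / CREATE TABLE AS / DETACH statements through
// the already-open connection, one statement at a time.
void exportArtistAlbums(QSqlDatabase &db)
{
    QSqlQuery query(db);
    const QStringList statements = {
        "ATTACH DATABASE 'out.sqlite' AS outdb",
        "CREATE TABLE outdb.album AS SELECT * FROM album WHERE ArtistId = 51",
        "DETACH DATABASE outdb"
    };

    for (const QString &sql : statements) {
        if (!query.exec(sql))
            qDebug() << "Statement failed:" << query.lastError().text();
    }
}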
Mike