Unsolved QSqlQuery crashes when inserting large bytea data
-
Hi there, I have an application built with Qt 5.6.3 and a PostgreSQL 10 database. The app runs plenty of queries that store/retrieve data from the db. Most of them involve blob/bytea columns, and that works fine for small sizes (< 10 MB), but now I want to save 150 MB in one record field instead of 10 MB. If I try to store the 150 MB of data the same way I do the 10 MB, I get a bad-allocation crash in QSqlQuery::exec(). I use this method in plenty of other locations in the application without any issues; the only difference this time is that the blob is 150 MB.
The PostgreSQL limit for a bytea field is 1 GB, so I'm well under that. The machine I'm running this on has 16 GB of RAM, of which 10 GB is free when the code runs. Anyone have an idea of what is going on? -
Hi and welcome to devnet,
Are you building that application in 64-bit mode?
-
Hi SGaist, thanks for responding. No, it's a 32-bit application. If I chop the data into pieces (36 MB) it seems to be stored, but retrieval is not working. And let me reassure you that the query and the mechanism to retrieve the data from the database work; I just can't understand why this is happening. I have also looked at the configuration of the PostgreSQL db and tweaked it accordingly (e.g. increased shared_buffers to 512 MB).
-
Sounds like you are hitting the 2 GB address-space limit of a 32-bit process. Did you check the memory usage of your application? Are you at the limit when doing that big save to the database?
-
A 150 MB QByteArray will become a ~600 MB char* when inserting into a PostgreSQL database (due to escaping etc.), if I remember correctly. Make sure you have that amount of contiguous memory available. Also, please post a proper backtrace so we can see whether it really crashes inside the QPSQL code.
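If the contiguous allocation is the problem, one way around it (besides building 64-bit) is to split the blob into fixed-size chunks and store one row per chunk, so each INSERT only needs a small contiguous buffer. A minimal sketch of just the splitting, with a hypothetical helper name and chunk size, not the poster's actual schema:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Split a large buffer into chunks of at most chunkSize bytes,
// so each INSERT only needs a small contiguous allocation.
std::vector<std::vector<char>> splitIntoChunks(const std::vector<char>& blob,
                                               std::size_t chunkSize)
{
    std::vector<std::vector<char>> chunks;
    for (std::size_t offset = 0; offset < blob.size(); offset += chunkSize) {
        const std::size_t len = std::min(chunkSize, blob.size() - offset);
        chunks.emplace_back(blob.begin() + offset, blob.begin() + offset + len);
    }
    return chunks;
}
```

Each chunk would then be bound to its own bytea row (with a sequence number column) and reassembled on retrieval.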
-
Sorry guys, I was caught up in something else, but thanks for replying. I have chopped the data into small pieces (~35 MB) and tried to upload them via different threads. Only the first file gets uploaded to the database, but when I do this sequentially all of them get uploaded. I'm using std::future and currently allow only two threads at the same time. Below I have tried to give a concise overview of my code: I read files from disk (movieFileList), put them in a QByteArray (base64) and upload them to the database. I can tell you that a single upload to the database works, because I use it everywhere in my application, but the parallel upload doesn't. For each thread I create a new connection (db.cpp), which I close in the thread when I'm done. Can anyone see what I'm doing wrong?
mediafile.cpp

```cpp
std::vector< QString > movieFileList;
std::vector< std::future< bool > > futures;
for( auto filename : movieFileList )
{
    std::future< bool > futureDb = std::async( std::launch::async, [ & ]()->bool
    {
        QFile movie( filename );
        bool result( false );
        if( !movie.exists())
            return result;
        MediaData mediaData;
        mediaData.m_fileId = fileId;          // <-- int
        mediaData.m_typeId = m_colorType;     // <-- enum
        mediaData.m_creationtime = QDateTime::currentDateTime();
        movie.open( QIODevice::ReadOnly );
        mediaData.m_movie = std::move( movie.readAll().toBase64());
        result = DAL::getInstance().Media()->addMediaRecord( mediaData );
        movie.close();
        Db::db()->database()->close();
        return result;
    });
    futures.push_back( std::move( futureDb ));
    while( futures.size() > 1 )
    {
        for( auto& future : futures )
        {
            if( future.wait_for( std::chrono::seconds(0)) == std::future_status::ready )
            {
                std::swap( future, futures.back());
                futures.pop_back();
                break;
            }
        }
    }
}
......
```

db.cpp

```cpp
static QThreadStorage< Database* > s_databaseConnections;
QAtomicInt connectionCounter(0);
Db* Db::m_instance;
Db* DbSchema::m_db;

void Db::createInstance(Db* db)
{
    m_instance = db;
}

Db::Db(const char* name)
    : m_name(name),
      m_database( new Database( name ) )
{
    if( !s_databaseConnections.hasLocalData() )
        s_databaseConnections.setLocalData( m_database.get() );
}

Db::~Db() {}

Database* Db::database() const
{
    if( s_databaseConnections.hasLocalData())
    {
        return s_databaseConnections.localData();
    }
    else
    {
        QString connectionName = QString("conn_%1").arg(++connectionCounter);
        QSqlDatabase dbConnection = QSqlDatabase::cloneDatabase(m_database->sqlDb(), connectionName);
        Database* database = new Database( QString::fromStdString( m_name ), dbConnection );
        database->open();
        s_databaseConnections.setLocalData( database );
        return database;
    }
}
......
```
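One thing worth double-checking in the loop above: the lambda captures everything by reference (`[ & ]`), including the loop-local `filename`, and the while loop only waits until one future is left, so a still-running task can outlive the `filename` it references. A minimal sketch of the safer pattern, capturing per-iteration data by value (plain std::async, with a stand-in body instead of the real upload):

```cpp
#include <future>
#include <string>
#include <vector>

// Capture the per-iteration data by value so each task owns its own copy,
// even if it finishes after the loop iteration that launched it.
std::vector<bool> uploadAll(const std::vector<std::string>& files)
{
    std::vector<std::future<bool>> futures;
    for (const auto& filename : files) {
        futures.push_back(std::async(std::launch::async,
            [filename]() -> bool {           // by value: safe copy
                return !filename.empty();    // stand-in for the real upload
            }));
    }
    std::vector<bool> results;
    for (auto& f : futures)
        results.push_back(f.get());          // get() also propagates exceptions
    return results;
}
```

Calling get() on every future before the containing scope ends also guarantees no task is still touching destroyed locals.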
tblmedia.cpp

```cpp
int TblMedia::addMediaRecord( const MediaData& file )
{
    std::shared_ptr< SqlQueryStatement > statement = SqlQueryExecutor::execute(
        "INSERT INTO media ( fileid, typeid, movie, creationtime ) "
        "VALUES ( ?, ?, ?, ? )",
        SqlQueryData().add( file.m_fileId ).add( file.m_typeId ).add( file.m_movie ).add( file.m_creationtime ));
    return statement->getResult() ? statement->getLastInsertedId() : -1;
}
```
-
You must not use a QSqlDatabase connection in a thread other than the one you created it in.
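The usual pattern is to create the connection inside the worker thread under a name unique to that thread, and remove it again before the thread exits. A sketch of that lifecycle, not tested against the poster's wrapper classes: the name generator is plain C++, and the Qt calls are shown as they would typically appear.

```cpp
#include <atomic>
#include <string>

// One unique connection name per worker; QSqlDatabase identifies
// connections by name, so every thread needs its own.
std::string nextConnectionName()
{
    static std::atomic<int> counter{0};
    return "conn_" + std::to_string(++counter);
}

// Typical per-thread usage (Qt part, sketched):
//
//   const QString name = QString::fromStdString(nextConnectionName());
//   {
//       QSqlDatabase db = QSqlDatabase::addDatabase("QPSQL", name);
//       db.setDatabaseName(/* ... */);
//       db.setHostName(/* ... */);     // etc.
//       if (db.open()) { /* run queries through this db */ }
//       db.close();
//   }   // the QSqlDatabase object must go out of scope first
//   QSqlDatabase::removeDatabase(name);
```

Removing the connection after the last QSqlDatabase handle is destroyed avoids the "connection is still in use" warning Qt prints otherwise.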
-
Christian, you're absolutely right, and that's why I try to create a connection in the thread that I start; I hoped that including the partial code of db.cpp would illustrate that. I used QThreadStorage to create and hold the per-thread connection. But what I see is that the future is in the ready state while the upload is not done. Any ideas?
-
@xyfix I had overlooked s_databaseConnections.
You should try to make it work without threads first anyway.
-
Yes, I did, and that works (it just takes ages to upload :-) ), but eventually everything gets uploaded to the db. I have checked with octet_length whether the blob size in PostgreSQL is correct, and it is. I have also checked with qDebug statements and saw that a new connection is made to the database for each thread started.
-
One more thing: cloneDatabase() had some issues with thread safety. Can you, for testing, create a completely new db connection?
-
Ok, I have replaced cloneDatabase() with this, but no success:

```cpp
QString connectionName = QString("conn_%1").arg(++connectionCounter);
QSqlDatabase dbConnection = QSqlDatabase::addDatabase("QPSQL", connectionName );
dbConnection.setDatabaseName( m_database->sqlDb().databaseName());
dbConnection.setHostName( m_database->sqlDb().hostName());
dbConnection.setPort( m_database->sqlDb().port());
dbConnection.setUserName( m_database->sqlDb().userName());
dbConnection.setPassword( m_database->sqlDb().password());
```
Edit: I have added some more qDebug statements to make sure that I'm sending blob data, and I have verified that it is being sent. But the files do not reach the database: of the 20 files in the for loop, only 5 entries reach the database, and they are all 0 KB.
-
@xyfix said in QSqlQuery crashes when inserting large bytea data:
DAL::getInstance().Media()->addMediaRecord( mediaData );
Are you sure you use the correct db connection here?
-
@Christian-Ehrlicher, after splitting the large file into smaller pieces I don't get any crashes anymore; it's just that the parallel version does not upload any files (all are 0 KB), while the sequential version uploads all of them correctly. I have put a qDebug statement in the function that is called by SqlQueryExecutor::execute(), and the database is the newly created one: "conn_3", "conn_4", ...

```cpp
std::shared_ptr< SqlQueryStatement > SqlQueryExecutor::execute( const QString& query, const SqlQueryData& sqlData )
{
    qDebug() << "execute db " << Db::db()->database()->sqlDb().connectionName();
    std::shared_ptr< SqlQueryStatement > statement( new SqlQueryStatement( Db::db()->database(), query, sqlData ));
    return statement;
}
```
-
@xyfix said in QSqlQuery crashes when inserting large bytea data:
DAL::getInstance().Media()->addMediaRecord( mediaData );
I still don't understand how this function can fetch the thread local database since you don't pass it - how do you access the db in there?
-
@Christian-Ehrlicher, I hope this explains a couple of things along with the previously posted code. The Database class holds the QSqlDatabase object.
dal.h
```cpp
static DAL& getInstance()
{
    static DAL instance;
    return instance;
}

TblMedia* Media() { return m_dbMedia.get(); }

std::unique_ptr< TblMedia > m_dbMedia;
```
dal.cpp
```cpp
DAL::DAL()
{
    m_dbMedia.reset( new TblMedia());
}
```
tblmedia.h
```cpp
class TblMedia
{
public:
    explicit TblMedia();
    int addMediaRecord( const MediaData& file );
};
```
tblmedia.cpp
```cpp
int TblMedia::addMediaRecord( const MediaData& file )
{
    std::shared_ptr< SqlQueryStatement > statement = SqlQueryExecutor::execute(
        "INSERT INTO media ( fileid, typeid, movie, creationtime ) "
        "VALUES ( ?, ?, ?, ? )",
        SqlQueryData().add( file.m_fileId ).add( file.m_typeId ).add( file.m_movie ).add( file.m_creationtime ));
    return statement->getResult() ? statement->getLastInsertedId() : -1;
}
```
sqlqueryexecutor.h
```cpp
std::shared_ptr< SqlQueryStatement > execute( const QString& query, const SqlQueryData& sqlData = SqlQueryData());
```
sqlqueryexecutor.cpp
```cpp
std::shared_ptr< SqlQueryStatement > SqlQueryExecutor::execute( const QString& query, const SqlQueryData& sqlData )
{
    std::shared_ptr< SqlQueryStatement > statement( new SqlQueryStatement( Db::db()->database(), query, sqlData ));
    return statement;
}
```
-
Looks a little over-designed, but in theory it's fine. I would start by using a memory analyzer such as valgrind, or really use a simple QSqlQuery + bindValue() directly in the thread, just to be sure.
-
Ok, I'll keep you updated. I have also debugged all the way down to where QSqlQuery::exec() is called and looked at the database connection that is used, and it's the right one ("conn_3", "conn_4", ...), so at the connection level it looks ok.
-
I may have missed it, but are you sure you are creating the connections within the thread and not before?
-
@SGaist It should be; the connection is held in a QThreadStorage.