Performance problem filling a QJsonArray

4 Posts 3 Posters 541 Views
  • SeDi (#1)

    I have about 10k rows of data and the code below needs about 350 ms to write all this into a QJsonArray. It would really help to bring that down to 150-200 ms.

    I could solve a similar problem with multithreading (see this thread, thanks to @JonB!)

    Unfortunately I had to find out that I cannot use the same approach here, because now it's a QJsonArray, which cannot be pre-allocated and is not thread-safe. It seems I can't easily append two QJsonArrays either, so I don't see any way to do it in parallel.

    Any ideas on how to speed this up considerably?

    bool LogTableModel::saveDataToFile(QString fileName)
    {
        QElapsedTimer t;
        t.start();
            
        QJsonArray productsArray;
        
        for(int i = 0; i < this->rowCount(); i++) {
            QJsonObject itemObj;
    
            itemObj["uid"] = QString::number(this->m_columns.at(m_uidColumn)->at(i).value<qint32>());
            itemObj["dateAssigned"] = this->m_columns.at(m_dateAssignedColumn)->at(i).toDate().toString(this->dateFormat);
            itemObj["dateTimeEntry"] = this->m_columns.at(m_dateTimeEntryColumn)->at(i).toDateTime().toString(this->dateTimeFormat);
            itemObj["name"] = this->m_columns.at(m_nameColumn)->at(i).toString();
             // + 12 more entries
    
            productsArray.append(itemObj);
        }
    
        QJsonObject saveProductsObj;
        saveProductsObj.insert("logged_data", productsArray);
    
        qDebug() << "needed " << t.elapsed() << " ms for filling the array";
    
    
        QFile saveFile(fileName);
        if (!saveFile.open(QIODevice::WriteOnly)) {
            qWarning("Couldn't open log file ");
            return false;
        }
        saveFile.write(QJsonDocument(saveProductsObj).toJson(QJsonDocument::Indented));
        saveFile.close();
        return true;
    }
    

    Qt 5.15.2, Win64, compiling for Android
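
    [Editorial sketch, not from the thread: if building the QJson DOM itself is the bottleneck, one option worth measuring is to serialize each row straight into one pre-reserved text buffer and skip QJsonArray/QJsonObject entirely. The snippet below is plain, Qt-free C++ with a hypothetical Row type standing in for one model row; in Qt the same shape would use QByteArray::reserve() and append(). It deliberately handles only two fields and does no string escaping, so it is an illustration of the approach, not a drop-in replacement.]

    ```cpp
    #include <cstdio>
    #include <string>
    #include <vector>

    // Hypothetical stand-in for one row of the model.
    struct Row {
        int uid;
        std::string name;
    };

    // Build the JSON text directly; no DOM objects are allocated.
    std::string rowsToJson(const std::vector<Row>& rows) {
        std::string out;
        out.reserve(rows.size() * 64 + 32);  // rough per-row size estimate
        out += "{\"logged_data\":[";
        for (std::size_t i = 0; i < rows.size(); ++i) {
            if (i) out += ',';
            char buf[32];
            std::snprintf(buf, sizeof buf, "%d", rows[i].uid);
            out += "{\"uid\":\"";
            out += buf;
            out += "\",\"name\":\"";
            out += rows[i].name;  // real code must escape quotes and backslashes
            out += "\"}";
        }
        out += "]}";
        return out;
    }
    ```

    Because the buffer is reserved once up front, there is no per-row reallocation; the trade-off is that you give up the convenience and correctness guarantees of QJsonDocument.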

    • DerReisende (#2)

      In one of my projects I found that QDate conversion operations were really slow, so I ended up using a lookup cache to check whether I had already converted an item before. In my use case the cache gave a 30x speedup compared with calling the QDate functions for every row.
      If your dateAssigned and dateTimeEntry don't change much, this approach may be worth a try and could give you the speedup you need.
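
      [A Qt-free sketch of the lookup-cache idea, for illustration only; slowFormat is a made-up stand-in for an expensive conversion such as QDate::toString(), and in Qt you would key a QHash on the QDate itself.]

      ```cpp
      #include <string>
      #include <unordered_map>

      // Stand-in for an expensive conversion such as QDate::toString().
      std::string slowFormat(int julianDay) {
          return "day-" + std::to_string(julianDay);
      }

      // Memoizing wrapper: each distinct key is converted only once.
      std::string cachedFormat(int julianDay,
                               std::unordered_map<int, std::string>& cache) {
          auto it = cache.find(julianDay);
          if (it != cache.end())
              return it->second;  // cache hit: no conversion performed
          std::string s = slowFormat(julianDay);
          cache.emplace(julianDay, s);
          return s;
      }
      ```

      The speedup depends entirely on how many duplicate keys the data contains: with few distinct dates the cache pays for itself quickly, while with mostly unique keys it only adds overhead.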

        • JonB (#3)

        @SeDi
        Thoughts:

        • Certainly see whether there is anything in @DerReisende's comment about QDates, I don't know. Try saving without the QDate columns, or with a single, fixed, pre-calculated value, to see whether that makes a difference?

        • In the code you show, only the productsArray.append(itemObj); line should be an issue with multiple threads. All the lines filling QJsonObject itemObj; should be quite independent across separate threads. If you use threads and only put a mutex lock around that one line, does it improve?

        • 350 ms for 10k rows, which will produce JSON output many times that size. Is that unreasonable? I don't know.
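
        [Editorial sketch of the threading idea, in plain C++ rather than Qt: if each thread writes into its own disjoint slots of a pre-sized result vector, no mutex is needed at all, and the per-append lock JonB mentions disappears. formatRows and its trivial uid-only formatting are illustrative, not the poster's code.]

        ```cpp
        #include <string>
        #include <thread>
        #include <vector>

        // Format rows in parallel; thread t handles indices t, t+n, t+2n, ...
        // Each thread writes only its own slots, so no lock is required.
        std::vector<std::string> formatRows(const std::vector<int>& rows,
                                            unsigned nThreads) {
            std::vector<std::string> out(rows.size());
            std::vector<std::thread> workers;
            for (unsigned t = 0; t < nThreads; ++t) {
                workers.emplace_back([&out, &rows, nThreads, t] {
                    for (std::size_t i = t; i < rows.size(); i += nThreads)
                        out[i] = "{\"uid\":" + std::to_string(rows[i]) + "}";
                });
            }
            for (auto& w : workers) w.join();
            return out;  // join with ',' and wrap in [] to get the array text
        }
        ```

        This sidesteps QJsonArray's thread-safety problem by only merging into the shared structure once, after all workers have finished.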


          • SeDi (#4)

          @DerReisende
          Thank you very much for your hint! I have declared two private class members:

          QHash<QDate, QString> m_dateCache;
          QHash<QDateTime, QString> m_dateTimeCache;
          

          In my implementation I alternate between method calls that use cache lookup and those just doing the conversion.

          bool LogTableModel::saveDataToFile(QString fileName, bool useCache)
          

          My actual use of the cache is here:

                  if (useCache) {
                      QDate d = this->m_columns.at(m_dateAssignedColumn)->at(i).toDate();
                      QString sDate = m_dateCache.value(d,QString());
                      if (sDate.isEmpty()) {
                          sDate = d.toString(this->dateFormat);
                          m_dateCache.insert(d,sDate);
                      }
                      itemObj["dateAssigned"] = sDate;
                } else {
                    itemObj["dateAssigned"] = this->m_columns.at(m_dateAssignedColumn)->at(i).toDate().toString(this->dateFormat);
                }
          

          The results show that with useCache == true the code is indeed a tad quicker, but only by some 3% (not counting the slower initial run). Using the same approach on the DateTime field actually makes the code slower, presumably because I never have two identical DateTimes, so the QHash just grows large without ever producing a hit.

          I have tried to cache the QVariant instead, to get rid of the "toDate()":

          QHash<QVariant, QString> m_dateCache;
          

          but my lookup

          QString sDate = m_dateCache.value(v,QString());
          

          makes the compiler complain about "no matching function call for qHash": QVariant does not seem to have a suitable qHash() overload - and I wouldn't expect any magical acceleration here either.
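
          [Editorial note: the compiler error arises because a hash-based container needs a hash function for its key type, and none is provided for QVariant. The same situation can be reproduced in plain C++ with std::variant, where the fix is to supply a custom hasher - analogous to writing a qHash() overload for the key type. Key, KeyHash, and Cache below are illustrative names.]

          ```cpp
          #include <cstddef>
          #include <string>
          #include <unordered_map>
          #include <variant>

          // A variant key, analogous to using QVariant as a QHash key.
          using Key = std::variant<int, std::string>;

          // The container cannot hash the key by itself; we supply a hasher
          // that dispatches to the hash of whichever alternative is held.
          struct KeyHash {
              std::size_t operator()(const Key& k) const {
                  return std::visit([](const auto& v) {
                      return std::hash<std::decay_t<decltype(v)>>{}(v);
                  }, k);
              }
          };

          using Cache = std::unordered_map<Key, std::string, KeyHash>;
          ```

          In Qt the equivalent would be a free function qHash(const QVariant&, uint seed) visible to QHash - though, as noted above, hashing the variant wrapper is unlikely to be faster than just calling toDate() first.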

          But even if it's not a huge boost, I've learned much from this hint. Thank you very much for the input!
          I'll try the mutex idea next.
