
QProcess scalability issue

  • I am writing a console application to manage anything up to 100k processes (or as many as there is room for in memory), and I'm using QProcess to set up two-way communication with each process.

    • When I use any of the "waitFor.." methods in QProcess, such as "waitForStarted()" or "waitForReadyRead()", I get a buffer overflow error from QtCore after starting around 200 processes.

    • When I don't use any of these methods, I can start 10k processes easily. I can also connect to the signals of these processes, so instead of explicitly calling "waitForReadyRead()" I wait for the process to emit the signal "readyRead()" and then read the process output without problems.

    In other words, I've found a workaround (or maybe the proper solution), but it worries me that I have to be picky about which methods to use to avoid ugly crashes.

    Any insight would be appreciated.

    On Linux and Mac the maximum number of open files per user has to be raised first; otherwise that limit causes another (understandable) crash. In my case (on Ubuntu 12.04) I do this with ulimit -n 102400

    Example code:

    #include <iostream>
    #include <QProcess>
    #include <vector>

    using namespace std;

    vector<QProcess*> procs;

    QString command = "./myProcess";

    QStringList args;

    int process_count = 200;

    int main(int argc, char** argv){

        cout << "Setting up " << process_count << " processes" << endl;

        for(int i = 0; i < process_count; i++)
            procs.push_back(new QProcess);

        cout << "Created processes" << endl;

        vector<QProcess*>::iterator it;
        for(it = procs.begin(); it != procs.end(); it++)
            (*it)->start(command, args);

        cout << "All started. Checking their output..." << endl;

        for(it = procs.begin(); it != procs.end(); it++){
            (*it)->waitForStarted();   //~200 => CRASH
            (*it)->waitForReadyRead(); //~200 => CRASH
        }
        cout << "All processes responded." << endl;

        for(it = procs.begin(); it != procs.end(); it++)
            (*it)->kill();

        cout << "All killed. Done." << endl;
        return 0;
    }

    Backtrace of core dump:

    *** buffer overflow detected ***: ./qprocess_testing terminated
    ======= Backtrace: =========

    And the code of the dummy process "myProcess":

    #include <iostream>

    using namespace std;

    void work(){
        for(int i = 0; i < 200000000; i++){
            if(false){ // Dirty hack to avoid compiler optimization
                cout << "No way: " << i << endl;
            }
        }
    }

    int main(){

        cout << "!";

        char c;

        cin >> c;
        cout << ++c;
        return 0;
    }




  • Did you ever find a solution to this problem? I'm running 5.2.1 and QProcess seems to freak out once the app has created 200-ish.

    It didn't have any problem running 10,300 threads, so what is going on here?

  • Hi, I was never able to get to any scale with waitForStarted() and waitForReadyRead(), so as far as I can tell this has to be a scalability limitation or a bug. However, when I dropped those waitFor.. functions and just went with good old signals and slots, I got up to >20k processes, which solved my problem. Unfortunately I don't have a concise code example at hand, but I ended up writing a monitor class that subscribed to the readyRead() signal from each QProcess. On that signal I would bite off whatever was in that process's buffer, which gave me all the output from all the processes and actually turned out quite nicely (I just had to deal with concurrency a bit more). I also did the same thing for the inverse case of writing to the QProcesses, so this works fine in both directions.

    So, my conclusion: Still a bug, but restricted to those particular functions. If you can live without them, you're probably in for a few more lines of code, but maybe a better solution overall.
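For readers landing here later, the signal-driven pattern described above might look roughly like this. This is a hedged sketch, not the poster's actual monitor class: the function name spawnAll and the structure are my own, and it needs Qt 5 (the lambda connect syntax avoids moc/Q_OBJECT):

```cpp
#include <QCoreApplication>
#include <QProcess>
#include <QByteArray>
#include <QList>

// Sketch: spawn many child processes and drain their output in the
// readyReadStandardOutput signal handler instead of blocking in the
// waitFor.. calls that crash at scale.
static QList<QProcess*> spawnAll(const QString &program,
                                 const QStringList &args,
                                 int count, QObject *parent){
    QList<QProcess*> procs;
    for(int i = 0; i < count; i++){
        QProcess *p = new QProcess(parent);
        // Qt 5 functor-style connect: the lambda fires whenever this
        // particular process has output waiting.
        QObject::connect(p, &QProcess::readyReadStandardOutput, [p](){
            QByteArray chunk = p->readAllStandardOutput();
            // ... hand 'chunk' to whatever consumes the output ...
            Q_UNUSED(chunk);
        });
        procs.append(p);
        p->start(program, args);  // non-blocking; no waitForStarted()
    }
    return procs;
}

int main(int argc, char **argv){
    QCoreApplication app(argc, argv);
    QList<QProcess*> procs = spawnAll("./myProcess", QStringList(), 1000, &app);
    return app.exec();  // the event loop delivers the readyRead signals
}
```

The key design point is that nothing ever blocks per-process: the event loop multiplexes all the pipes, so the per-process cost is just a few file descriptors.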
