QProcess scalability issue

General and Desktop · 3 Posts · 2 Posters · 1.7k Views
  • EpsilonZero
    #1

    I am writing a console application to manage anything up to 100k processes (or as many as there is room for in memory), and I'm using QProcess to set up two-way communication with each process.

    • When I use any of the "waitFor.." methods in QProcess, such as waitForStarted() or waitForReadyRead(), I get a buffer overflow error from QtCore after starting around 200 processes.

    • When I don't use any of these methods, I can start 10k processes easily. I can also connect to the signals of these processes, so instead of explicitly calling waitForReadyRead() I wait for each process to emit the readyRead() signal and then read its output without problems.

    In other words, I've found a workaround (or maybe the proper solution), but it worries me that I have to be picky about which methods to use to avoid ugly crashes.
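The signal-based workaround can be sketched roughly as follows (a minimal single-process sketch, not the poster's actual code; it assumes Qt 5's function-pointer connect syntax and requires a running event loop):

```cpp
#include <QCoreApplication>
#include <QProcess>
#include <QDebug>

int main(int argc, char** argv) {
    QCoreApplication app(argc, argv);

    QProcess* proc = new QProcess(&app);

    // Instead of blocking in waitForReadyRead(), let the event loop
    // deliver readyRead() and drain the buffer in the handler.
    QObject::connect(proc, &QProcess::readyRead, [proc]() {
        qDebug() << proc->readAll();
    });

    proc->start("./myProcess", QStringList());
    return app.exec(); // event loop dispatches readyRead() as output arrives
}
```

Because nothing blocks per process, the same pattern scales to many QProcess instances by repeating the connect/start calls in a loop.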

    Any insight would be appreciated.

    PS:
    On Linux and Mac the maximum number of open files per user has to be raised first; otherwise that causes another (understandable) crash. In my case (on Ubuntu 12.04) I do this with ulimit -n 102400

    Example code:

    @
    #include <iostream>
    #include <vector>
    #include <cstdlib> // atoi
    #include <QProcess>

    using namespace std;

    vector<QProcess*> procs;

    QString command = "./myProcess";
    QStringList args;
    int process_count = 200;

    int main(int argc, char** argv) {

        if (argc > 1)
            process_count = atoi(argv[1]);

        cout << "Setting up " << process_count << " processes" << endl;

        /*
        Create
        */
        for (int i = 0; i < process_count; i++)
            procs.push_back(new QProcess);

        cout << "Created processes" << endl;

        /*
        Start
        */
        vector<QProcess*>::iterator it;
        for (it = procs.begin(); it != procs.end(); it++)
            (*it)->start(command, args);

        cout << "All started. Checking their output..." << endl;

        /*
        Read
        */
        for (it = procs.begin(); it != procs.end(); it++) {
            (*it)->waitForStarted();   //~200 => CRASH
            (*it)->waitForReadyRead(); //~200 => CRASH
        }
        cout << "All processes responded." << endl;

        /*
        Kill
        */
        for (it = procs.begin(); it != procs.end(); it++)
            (*it)->kill();

        cout << "All killed. Done." << endl;
    }
    @

    Backtrace of core dump:

    @
    *** buffer overflow detected ***: ./qprocess_testing terminated
    ======= Backtrace: =========
    /lib/x86_64-linux-gnu/libc.so.6(__fortify_fail+0x37)[0x7f02694ea807]
    /lib/x86_64-linux-gnu/libc.so.6(+0x109700)[0x7f02694e9700]
    /lib/x86_64-linux-gnu/libc.so.6(+0x10a7be)[0x7f02694ea7be]
    /usr/lib/x86_64-linux-gnu/libQtCore.so.4(+0x1554d4)[0x7f0269e0d4d4]
    ./qprocess_testing[0x400e35]
    /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed)[0x7f026940176d]
    ./qprocess_testing[0x400fa9]
    @

    And the code of the dummy process "myProcess":

    @
    #include <iostream>
    #include <unistd.h> // sleep

    using namespace std;

    void work() {
        for (int i = 0; i < 200000000; i++) {
            if (false) { // Dirty hack to avoid compiler optimization
                cout << "No way: " << i << endl;
            }
        }
    }

    int main() {

        sleep(1);
        cout << "!";

        char c;
        while (true) {
            cin >> c;
            work();
            cout << ++c;
        }
    }
    @

    • Statix
      #2

      Did you ever find a solution to this problem? I'm running 5.2.1, and QProcess seems to freak out once the app has created around 200 processes.

      It didn't have any problem running 10,300 threads, so what is going on here?

      • EpsilonZero
        #3

        Hi, I was never able to get to any scale with waitForStarted() and waitForReadyRead(), so as far as I can tell this has to be either a scalability limitation or a bug. However, when I dropped those waitFor.. functions and went with good old signals and slots, I got up to more than 20k processes, which solved my problem. Unfortunately I don't have a concise code example at hand, but I ended up writing a monitor class that subscribed to the readyRead() signal of each QProcess. On that signal I would read whatever was in the emitting process's buffer, which gave me all the output from all the processes and actually turned out quite nicely (I just had to deal with concurrency a bit more). I did a similar thing for the inverse case of writing to the QProcesses, so this works fine in both directions.
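A monitor along those lines might look roughly like this (a hypothetical sketch, not the poster's actual code; it uses the Qt 4-era SIGNAL/SLOT macros and sender() to identify the emitting process, and needs moc as any QObject subclass does):

```cpp
#include <QObject>
#include <QProcess>
#include <QDebug>

class ProcessMonitor : public QObject {
    Q_OBJECT
public:
    // Subscribe to one process; call once per QProcess you start.
    void watch(QProcess* proc) {
        connect(proc, SIGNAL(readyRead()), this, SLOT(onReadyRead()));
    }

public slots:
    // Drain whatever is currently in the emitting process's buffer.
    void onReadyRead() {
        QProcess* proc = qobject_cast<QProcess*>(sender());
        if (proc)
            qDebug() << proc->readAll();
    }
};
```

The event loop calls onReadyRead() once per readyRead() emission, so no per-process blocking call is ever needed; writing to the processes can be handled symmetrically with a write() wrapper on the same class.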

        So, my conclusion: Still a bug, but restricted to those particular functions. If you can live without them, you're probably in for a few more lines of code, but maybe a better solution overall.
