Yet another QProcess write issue

  • I saw several questions here and on SO but I still don't understand my mistake.
    My Qt5 console application should exchange asynchronous messages with another console application (written in plain C/C++). I don't have its code, but if I launch it from a Linux console and I type a specific command on stdin, I get an answer.

    Here's what I do:

    QProcess *process; // defined in header

    // in MyApp class constructor
    process = new QProcess();
    connect(process, &QProcess::readyReadStandardOutput, this, &MyApp::readyReadStandardOutput);

    void MyApp::send(QByteArray data) {
        process->write(data);
    }

    void MyApp::readyReadStandardOutput() {
        qDebug() << process->readAllStandardOutput();
    }

    I try to call send with the very same command I typed before, but I receive nothing in return - I'm afraid the problem is in the write.

    Is there something wrong here?

  • Lifetime Qt Champion


    You are not checking that your application started correctly nor did you check the standard error for anything showing there.

    You can use errorOccurred and readyReadStandardError.

  • @SGaist I simplified the whole class a lot here. Actually I catch the stateChanged signal and wait until the process is running before writing anything. Then I double-check in the send function:

    if (process->state() == QProcess::Running && process->isWritable())
        qDebug() << process->write(data);

    and I get the correct number of bytes written.
    I also monitor stderr and the errorOccurred signal...
    It's not the first time I've used QProcess, but now I'm stuck.

    I wonder if it's correct to "emulate" a keyboard typing in that way.


    After calling the send function a number of times, I finally got all the answers at once.
    So I bet there's something wrong with the buffers...

  • @Mark81

    After calling the send function a number of times,

    How did you code that? In a tight loop? On a timer? Was the Qt main event loop allowed to run? That might make a difference.

  • @JonB I call send() in the slot of a QTimer firing every second.
    But I guess the problem is that the external C application doesn't flush stdout after printf. So in a terminal everything works fine, but over pipes the buffering blocks the communication (until the buffers fill up).

    Because I have no control over the external application, is there a way to tell QProcess to run unbuffered on both read and write?

  • @Mark81
    That, I'm afraid, is correct.

    If the other side uses, say, printf(), it will be treating stdout as fully buffered, as though writing to a file, rather than line buffered as it would be for a terminal; it doesn't see the pipe connection as a terminal. So unless it is a "logging" program which flushes its output all the time, you are likely not to receive anything until it has output perhaps 4k bytes, or maybe 512. Is that the sort of chunk size you are seeing?

    BTW, you haven't said whether you are on Linux or Windows. Not that there's a solution either way, just possibly different behaviour.

  • I wrote it, but it's quite hidden :-) I'm working on Linux.
    The buffer seems to fill after 3k (27 packets of 114 bytes each). Weird.

    I also tried to start the process using QIODevice::ReadWrite | QIODevice::Unbuffered but the behavior is the same.

  • @Mark81
    I had already looked up the "unbuffered" option, and had seen a report that it did not help, and the first comment to that.

    There is also some debate over exactly when your write() will actually send its data, and you can't do anything about that, so there could be a delay there, though the chunkiness would imply buffering at the other end. You don't have control of that end's code, do you?

    As for Linux, I was reading about someone trying to do pseudo-ttys from QProcess to fake terminal so that the other end would at least maybe line buffer instead of size buffer, but do you really want me to dig that out so that you can implement? :)

  • @JonB said in Yet another QProcess write issue:

    As for Linux, I was reading about someone trying to do pseudo-ttys from QProcess to fake terminal so that the other end would at least maybe line buffer instead of size buffer, but do you really want me to dig that out so that you can implement? :)

    Only if it takes 1 minute to find out a link :-)
    Otherwise it's enough to put me on the right way using the correct words so I can do the search by myself!

  • @Mark81
    I think the one I recall was this one. There's not much in it. He confirms what we have been saying here. He simply mentions at one point:

    If you were on Unix, you could force the behaviour by using a pseudoterminal connection. Qt wouldn't help you here, but it would be possible.

    So actually I think he's saying Qt would get in your way: possible, but not pleasant.

    Really your issue has nothing to do with Qt. The problem is that the other end is buffering output which you need immediately, and that's down to the other end. You're supposed to write to the authors and ask them to flush more often/unbuffer for you. :)

    BTW, you should be able to observe the other end's behaviour wrt buffering outside of Qt from the terminal via something like echo your-commands | otherapp | cat -u. Piping its output that way should be block-buffered, as from QProcess, while in a terminal echo your-commands | otherapp should only be line-buffered; if it's outputting \ns you would see a difference.

    Have a read of this. It should be your situation (assuming the write of commands to the other process is not at issue). First you'll see confirmation of what we're saying, then a couple of possibilities: stdbuf might help you...?

  • @JonB Got it. Thanks very much for the time spent to make things more clear!
