QProcess -- working QIODevice::Unbuffered, or adjustable buffer size, for faster access to stdout
-
I've written some Qt code that monitors a long-running computation. The calculation is run via a QProcess instance and prints a status update each calculation cycle. Each update is 1-4 lines of about 80 characters, and they arrive every 30-60 seconds.
I need to be able to monitor the status output and react to its content quickly (e.g. restart with modified inputs if things start going badly), but currently QProcess seems to wait until a buffer somewhere fills before making the output available: it is sometimes several minutes before I see any output, at which point several dozen updates become available at once. When the calculation is run in a terminal, the updates appear smoothly, one at a time.
Is there some way to get access to the process's stdout sooner? From the docs and a quick skim of the sources, this doesn't look possible at the moment. Does anyone know of a possible way around this? Or can someone confirm that there is no current way to do this so I can open a bug report? ;-)
-
I'm running QProcess in a background thread, so it's easier to just spin on waitForReadyRead since it won't interfere with the GUI:
@
m_process = new QProcess(this);
m_process->setReadChannel(QProcess::StandardOutput);
m_process->start(command, QProcess::Unbuffered | QProcess::ReadWrite);
m_process->waitForStarted(-1);

while (canGetInput()) {
    m_process->write(getInput());
}
m_process->closeWriteChannel();

m_process->waitForReadyRead(-1);
while (!m_process->atEnd() || m_process->state() == QProcess::Running) {
    if (m_process->waitForReadyRead(5000)) {
        while (m_process->canReadLine()) {
            parseOutput(QString(m_process->readLine()));
        }
    }
    qDebug() << QString("Current process state: (%1) PID: (%2) error: (%3) bytes available: (%4)")
                    .arg(m_process->state())
                    .arg(m_process->pid())
                    .arg(m_process->error())
                    .arg(m_process->bytesAvailable());
}
@
bytesAvailable() usually returns 0 right up until it dumps several minutes' worth of data at once.
-
Yes, but that won't solve my problem. The way my code is written, it's easier to just spin rather than use signals/slots. Out of curiosity, I hooked a slot up to the readyRead() signal that writes to qDebug when the signal is emitted. The signals aren't emitted any more frequently. I need some way to access the QProcess's output sooner.
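For reference, the debug hookup was roughly the following (a minimal sketch; the class and slot names are illustrative):
@
// Minimal sketch of the debug hookup (class and slot names are illustrative).
connect(m_process, SIGNAL(readyRead()), this, SLOT(onReadyRead()));

// ...

void Monitor::onReadyRead()   // declared as a slot in the (hypothetical) Monitor class
{
    qDebug() << "readyRead emitted, bytesAvailable:" << m_process->bytesAvailable();
}
@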
-
Have you read the documentation? It states:
bq. bool QIODevice::canReadLine () const [virtual]
Returns true if a complete line of data can be read from the device; otherwise returns false.
Note that unbuffered devices, which have no way of determining what can be read, always return false.
This function is often called in conjunction with the readyRead() signal.
If you look at the code:
@
m_process->start(command, QProcess::Unbuffered | QProcess::ReadWrite);
@Your device is unbuffered, so canReadLine() always returns false....
-
adtrom: Nope -- I think the best solution here would be to add a flush() method to QProcess so that we could pull results as we needed them.
The trolls claim that this is a user issue, not a bug. It is definitely not a user issue -- there is nothing I can do to get the data faster; I'm stuck waiting for the buffer to fill. I opened a bug report here: http://bugreports.qt.nokia.com/browse/QTBUG-14503 Feel free to add your own comments or open a new one referencing it.
As for my problem, I just ended up reimplementing the functionality of the external program I was calling. Still working on it now, actually... Sure would have been nice to just be able to call a QProcess!
Gerolf: Actually, it does return true at times (IIRC, Unbuffered mode isn't actually implemented in QProcess, so that flag doesn't really do anything); it was just one of many things I tried while getting this to work. Besides, bytesAvailable() should return non-zero in Unbuffered mode when data is available, so I could spin on that instead. In any case, I tried passing every relevant combination of flags I could think of, and none worked the way I needed. There just needs to be a way to flush the pipe.
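A minimal sketch of that bytesAvailable() approach, assuming Unbuffered behaved as documented (with the current buffering it makes no practical difference):
@
// Sketch: spin on bytesAvailable() instead of canReadLine(). With the current
// buffering this behaves no differently, but it is what a working Unbuffered
// mode would be expected to enable.
while (m_process->state() == QProcess::Running || m_process->bytesAvailable() > 0) {
    m_process->waitForReadyRead(1000);
    if (m_process->bytesAvailable() > 0)
        parseOutput(QString(m_process->readAll()));
}
@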
-
dlonie: Are you sure this is not a user bug? We are using QProcess (via a wrapper) all over the place in Qt Creator and do not have the issue you describe.
-
Just to be perfectly clear, QProcess does "work" in that it captures everything written to stdout by the external process; the problem is one of timing. In my case the external process writes a single line every few seconds, and I want to get that output as soon as it's available. What happens is that the output sits in QProcess's internal buffer, and my program is not notified that output is available until many lines have been written (note that calling canReadLine() explicitly makes no difference).
I'm pretty sure the issue is one of buffering (caveat: I'm no expert). That is, QProcess can set whatever buffering it wants on the file descriptor it holds (the fd connected to the external process's stdout). If the buffer is small enough, the external process may block more often while it waits for QProcess to empty the buffer, but that's what we want in cases where getting prompt feedback is more important. I would expect the Unbuffered flag to accomplish this, but it seems to have no effect.
In my case the external program is not one I've written or can recompile, but I've managed to work around the problem by making it generate much more output. Now the lines I'm interested in are mixed in with a ton of other crap, but the larger volume of output causes readyRead() to be emitted more frequently (as the internal QProcess buffer fills up, I'd imagine).
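The filtering itself is trivial. A rough sketch, with "STATUS:" standing in for the real prefix (illustrative only):
@
// Rough sketch of picking the interesting lines out of the extra output;
// "STATUS:" is an illustrative prefix, not the real program's format.
while (m_process->canReadLine()) {
    const QString line = QString(m_process->readLine()).trimmed();
    if (line.startsWith("STATUS:"))
        parseOutput(line);
}
@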
Hope that clarifies ...
-
Gerolf: That's a good point about Unbuffered and canReadLine(). In fact, if Unbuffered were working, you might expect exactly that behavior: QProcess would hold perhaps one byte, the external process would block because QProcess's buffer was full, and canReadLine() would never return true (unless that one char happened to be a newline, I guess). So the fact that adding the Unbuffered flag to the start() call makes no difference is more evidence that it's broken, in my opinion.
I think dlonie's suggestion is a good one. If QProcess's internal buffer is large (as it is now), the external process won't block; but if we could pull whatever output is pending by calling flush(), we could poll at whatever frequency is appropriate for the latency requirements of our application.
-
dlonie, redirecting stdout could also be of some use.
-
@
template<class Elem = char, class Tr = std::char_traits<Elem>>
class StdRedirector : public std::basic_streambuf<Elem, Tr>
{
public:
    /** Callback function type: receives the characters, their count, and the user data. */
    typedef void (*pfncb)(const Elem* _Ptr, std::streamsize _Count, void* pUsrData);

    /** Constructor.
     *  @param a_Stream   the stream to redirect
     *  @param a_Cb       the callback function
     *  @param a_pUsrData user data passed to the callback
     */
    StdRedirector(std::basic_ostream<Elem, Tr>& a_Stream, pfncb a_Cb, void* a_pUsrData)
        : m_Stream(a_Stream), m_pCallbackFunction(a_Cb), m_pUserData(a_pUsrData)
    {
        // Redirect the stream, keeping the original buffer so it can be restored.
        m_pBuffer = m_Stream.rdbuf(this);
    }

    /** Destructor. Restores the original stream buffer. */
    ~StdRedirector()
    {
        m_Stream.rdbuf(m_pBuffer);
    }

    /** Override xsputn and forward the data to the callback function. */
    std::streamsize xsputn(const Elem* _Ptr, std::streamsize _Count)
    {
        m_pCallbackFunction(_Ptr, _Count, m_pUserData);
        return _Count;
    }

    /** Override overflow and forward the data to the callback function. */
    typename Tr::int_type overflow(typename Tr::int_type v)
    {
        if (Tr::eq_int_type(v, Tr::eof()))   // nothing to forward on a flush
            return Tr::not_eof(v);
        Elem ch = Tr::to_char_type(v);
        m_pCallbackFunction(&ch, 1, m_pUserData);
        return Tr::not_eof(v);
    }

protected:
    std::basic_ostream<Elem, Tr>&   m_Stream;
    std::basic_streambuf<Elem, Tr>* m_pBuffer;
    pfncb                           m_pCallbackFunction;
    void*                           m_pUserData;
};

// ...

void outcallback(const char* ptr, std::streamsize count, void* pTextEdit)
{
    QTextEdit* p = static_cast<QTextEdit*>(pTextEdit);
    // The buffer handed to xsputn is not null-terminated, so use the explicit length.
    p->append(QString::fromLatin1(ptr, static_cast<int>(count)));
}

// ...

QTextEdit* teStdCout = new QTextEdit(wParent);
m_stdRedirector = new StdRedirector<>(std::cout, outcallback, teStdCout);
@
-
I think the problem is caused by buffering in the external program. Most programs written in C will buffer their output unless it is going to a tty. When it isn't, you have to fake it using something like unbuffer on Unix/Linux (part of the expect package, http://expect.sourceforge.net), which IIUC works by faking tty behaviour.
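For example, a minimal sketch of starting the process through unbuffer (assuming it is installed and on PATH; program and programArgs are placeholders):
@
// Sketch: start the external program through "unbuffer" (from the expect package)
// so it sees a pseudo-tty and line-buffers its output. Assumes unbuffer is on PATH;
// "program" and "programArgs" are placeholders for the real command and arguments.
QStringList args;
args << program << programArgs;
m_process->start("unbuffer", args);
@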
-
In the external program that you run with QProcess, how are you writing the text that gets read?
I had a similar issue with my QProcess. When I wrote to stdout, the readyRead() signal wasn't emitted until the process had already finished, since I didn't output very much text. I found this link (http://www.lubby.org/ebooks/qtconsoleapp2/qtconsoleapp2.html) very helpful for writing to stdout.
If you don't want to read the link: stdout (or the QTextStream, if you went that route) has a built-in buffer that needs to be flushed manually to trigger the readyRead() signal. If your output looks like this:
@
QTextStream o(stdout);
o << "Line of text here";
@
then readyRead() won't trigger unless you do this thousands of times or the QProcess finishes. I've found two options that work, depending on your implementation:
@
QTextStream o(stdout);
o << "Line of text here" << endl;
@
or
@
QTextStream o(stdout);
o << "Line of text here\n";
o.flush();
@
Either way you will force the stdout buffer to be flushed, and readyRead() should trigger. As I understand it (I'm no expert), QTextStream's endl manipulator is the same as "\n" except that it also triggers a flush in addition to starting a new line. I prefer the second method so that I don't get extra newline characters floating around.
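The same applies if the external program uses plain C stdio rather than QTextStream. A minimal sketch of the writer-side fix in that case (assuming you can modify the child program):
@
// Sketch of the equivalent fix for a child program that uses C stdio instead of
// QTextStream: either switch stdout to line buffering once, or flush after each line.
#include <cstdio>

int main()
{
    std::setvbuf(stdout, NULL, _IOLBF, BUFSIZ); // line-buffer stdout even when it isn't a tty

    std::printf("Line of text here\n");
    std::fflush(stdout);                        // or flush explicitly after each status line

    return 0;
}
@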
Hope that helps if anyone runs into this issue.