[Solved] QProcess problem with a bad file descriptor
-
I made a function that fetches HTML from a URL using QProcess.
But QProcess has a problem accessing a file:
nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
I see this log whenever I run the process.
I also wrote a helper function that sets file permissions, and a timer runs the processes in a fixed order. I want to get rid of the "Bad file descriptor" error.
LOG
770:264436] bytedata "[]\n"
cmd "curl \"http://dbpedia.org/data/Haeinsa.json\""
nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
"Location"
cmd "curl \"http://dbpedia.org/data/Haeinsa.json\" | perl /Users/yoshimi/Downloads/dbpediaPlaceData.txt 'Haeinsa'"
nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
cmd "curl \"https://en.wikipedia.org/api/rest_v1/page/media/Haeinsa\" | perl /Users/yoshimi/Downloads/WikiJSONimages.txt"
nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
cmd "curl \"https://en.wikipedia.org/w/index.php?title=Haeinsa&action=render\" | perl /Users/yoshimi/Downloads/WikiHTMLcleaner.txt"
nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
max processes reached! Stopping timer!
Code
```cpp
QFile* getFilePermission(QString paths)
{
    m_temporaryFile = new QFile(paths);
    m_temporaryFile->open(QIODevice::ReadWrite | QIODevice::Text);
    m_temporaryFile->setPermissions(QFile::Permissions(QFile::ExeOwner | QFile::ReadOwner));
    return m_temporaryFile;
}

QByteArray UnixCommand::execCommandAsNormalUserExt(int& no_process, QString mKeyword,
                                                   QString mRecordtype, QString mSubtype)
{
    static bool flag = true;
    QByteArray res;
    QString command = "", keyword = "";
    keyword = ParserNS::convertTitle(mKeyword);
    QProcess p;
    QString fileName;
    QProcessEnvironment env = QProcessEnvironment::systemEnvironment();
    env.insert("SHELL", "/bin/sh");
    p.setProcessEnvironment(env);
    QString program = getShell();

    if (flag == true) {
        setDefaultPath();
        flag = false;
    }

    if (no_process == 0) // bring subtype from dbpedia JSON
    {
        mName = "dbpediaplace";
        if (mSubtype.isEmpty())
            command = QString("curl \"http://dbpedia.org/data/%1.json\"").arg(keyword);
        else
            no_process++;
    }
    if (no_process == 1)
    {
        if (mRecordtype == ParserNS::JsonItem::Recordtype::THING)
        {
            QFile *tmp = getFilePermission(dbpediaPlaceFile);
            fileName = tmp->fileName();
            mName = fileName.toLower();
            command = QString("curl \"http://dbpedia.org/data/%1.json\""
                              " | perl %2 '%3'").arg(keyword).arg(fileName).arg(mKeyword);
            delete tmp;
        }
        else if (mRecordtype == ParserNS::JsonItem::Recordtype::MEDIUM
                 || mRecordtype == ParserNS::JsonItem::Recordtype::EVENT)
        {
            no_process += 1;
        }
    }
    if (no_process == 2)
    {
        QFile *tmp = getFilePermission(wikiJSONImagesFile);
        fileName = tmp->fileName();
        mName = fileName.toLower();
        command = QString("curl \"https://en.wikipedia.org/api/rest_v1/page/media/%1\""
                          " | perl %2").arg(keyword).arg(fileName);
        delete tmp;
    }
    else if (no_process == 3)
    {
        QFile *tmp = getFilePermission(wikipediaHTMLFile);
        fileName = tmp->fileName();
        mName = fileName.toLower();
        command = QString("curl \"https://en.wikipedia.org/w/index.php?title=%1&action=render\""
                          " | perl %2").arg(keyword).arg(fileName);
        delete tmp;
    }

    qDebug() << "cmd" << command;
    p.start(program, {QStringLiteral("-c"), command});
    p.waitForFinished(-1);
    res = p.readAllStandardOutput();
    p.close();
    return res;
}
```
-
Hi,
From the QProcess documentation, it seems that you should rather use setStandardOutputProcess to pipe the output of one process into the next.
-
I found what the problem is: I was calling QProcess while the program's thread was still running.
That's why QProcess had problems accessing the file system.
I modified my code following your advice and it works now.
Thanks for the advice.

```cpp
void ProcessManager::start_process(QString name)
{
    // Use resizable buffers, unlike the system.
    QByteArray byteerr;
    QByteArray byteout;

    _processes.append(curlProcess);
    // setup_connections(curlProcess, name);

    QString command = QString("curl \"http://dbpedia.org/data/%1.json\"").arg(mKeyword);
    QStringList args;
    args << "-c" << command;

    curlProcess->setStandardOutputProcess(perlProcess);
    curlProcess->start("/bin/sh", args);

    if (curlProcess->waitForFinished(-1))
    {
        // perlProcess->setProcessChannelMode(QProcess::ForwardedChannels);
        setup_connections(perlProcess, name);
        perlProcess->start("perl", { dbpediaPlaceFile, "Haeinsa" }, QIODevice::ReadOnly);

        // Give the child process some time to start.
        perlProcess->waitForStarted();

        if (perlProcess->waitForFinished(-1))
        {
            // Read all available data on both output streams.
            byteerr += perlProcess->readAllStandardError();
            byteout += perlProcess->readAllStandardOutput();
            qDebug() << byteerr;
            qDebug() << byteout;
        }
    }
}
```
-
@darongyi
I believe here you are running the first curl process to completion (curlProcess->waitForFinished(-1)) and then starting the perl process, with output from the curl piped into input of the perl. This should block and get stuck depending on the amount of output from the curl. Try it with more than something like 4 or 8K coming from the curl and you should "hang".
Assuming I am correct, you must not write it this way. You cannot afford to waitForFinished() on a process writing to an output pipe. You must start the reading process before you try to wait for the writing process to finish.
-
@JonB Thank you very much. I'm editing the code.
-
@darongyi
You don't actually need to waitForFinished() on anything other than the last command in a pipeline. Effectively each command waits for the one to the left of it to finish. I'd do something like:

- Create the QProcess for each process.
- Direct the output from the curl into the input for the perl.
- Start the curl, and probably waitForStarted() that one.
- Start the perl.
- waitForFinished() on the perl one.
One other thing: all you seem to do is run the curl, send all its output to the perl, and then pick up the output from the perl. Nothing more complicated. Therefore you might find it much simpler code-wise to just run a single command:

```cpp
start("cmd", QStringList() << "/c" << "curl ... | perl ...");
```
-