
QProcess has a problem with a file descriptor

  • I wrote a function that fetches HTML from a URL using QProcess, but QProcess seems to have trouble accessing a file. I see this log line when I run the process:

        nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]

    I also have a function that returns the file to use, and I run a timer so the processes execute in a fixed order.

    I want to fix the "Bad file descriptor" error.


    770:264436] bytedata  "[]\n"
    cmd "curl \"\""
    nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
    cmd "curl \"\" | perl /Users/yoshimi/Downloads/dbpediaPlaceData.txt 'Haeinsa'"
    nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
    cmd "curl \"\" | perl /Users/yoshimi/Downloads/WikiJSONimages.txt"
    nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
    cmd "curl \"\" | perl /Users/yoshimi/Downloads/WikiHTMLcleaner.txt"
    nw_path_close_fd Failed to close guarded necp fd 18 [9: Bad file descriptor]
    max processes reached! Stopping timer!


    QFile* getFilePermission(QString paths)
    {
        m_temporaryFile = new QFile(paths);
        return m_temporaryFile;
    }

    QByteArray UnixCommand::execCommandAsNormalUserExt(int& no_process, QString mKeyword,
                                                       QString mRecordtype, QString mSubtype)
    {
        static bool flag = true;
        QByteArray res;
        QString command = "", keyword = "";
        keyword = ParserNS::convertTitle(mKeyword);
        QProcess p;
        QString fileName;
        QProcessEnvironment env = QProcessEnvironment::systemEnvironment();
        env.insert("SHELL", "/bin/sh");
        QString program = getShell();
        if (flag == true)
            flag = false;
        if (no_process == 0) {   // bring subtype from dbpedia JSON
            mName = "dbpediaplace";
            command = QString("curl \"\"").arg(keyword);
        }
        if (no_process == 1) {
            if (mRecordtype == ParserNS::JsonItem::Recordtype::THING) {
                QFile *tmp = getFilePermission(dbpediaPlaceFile);
                fileName = tmp->fileName();
                mName = fileName.toLower();
                command = QString("curl \"\""
                                  " | perl %2 '%3'").arg(keyword).arg(fileName).arg(mKeyword);
                delete tmp;
            } else if (mRecordtype == ParserNS::JsonItem::Recordtype::MEDIUM
                       || mRecordtype == ParserNS::JsonItem::Recordtype::EVENT) {
                // (no command built for this branch in the posted snippet)
            }
        }
        if (no_process == 2) {
            QFile *tmp = getFilePermission(wikiJSONImagesFile);
            fileName = tmp->fileName();
            mName = fileName.toLower();
            command = QString("curl \"\""
                              " | perl %2").arg(keyword).arg(fileName);
            delete tmp;
        } else if (no_process == 3) {
            QFile *tmp = getFilePermission(wikipediaHTMLFile);
            fileName = tmp->fileName();
            mName = fileName.toLower();
            command = QString("curl \"\""
                              " | perl %2").arg(keyword).arg(fileName);
            delete tmp;
        }
        qDebug() << "cmd" << command;
        p.start(program, {QStringLiteral("-c"), command});
        res = p.readAllStandardOutput();   // note: reads before the process has finished, or even started
        return res;
    }

  • Lifetime Qt Champion


    From the QProcess documentation, it seems that you should rather use setStandardOutputProcess to pipe the output of one process into the input of the other.
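    A minimal sketch of that chaining, with echo and tr standing in for the curl and perl commands from the question (hypothetical placeholders, not the poster's actual commands):

    ```cpp
    #include <QCoreApplication>
    #include <QProcess>
    #include <QDebug>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);

        QProcess producer;  // plays the role of curl
        QProcess consumer;  // plays the role of perl

        // Everything the producer writes to stdout becomes the consumer's stdin.
        producer.setStandardOutputProcess(&consumer);

        producer.start("echo", {"hello"});
        consumer.start("tr", {"a-z", "A-Z"});

        // Only the last process in the chain needs waitForFinished().
        consumer.waitForFinished(-1);

        const QByteArray out = consumer.readAllStandardOutput();
        qDebug().noquote() << out;  // prints HELLO
        return 0;
    }
    ```

    Note that the pipe is set up before either process starts, so no shell is involved at all.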

  • I found what the problem was: I called QProcess while the program's thread was still running, which is why QProcess had trouble accessing the file system. I modified my code following your advice and it now works better. Thanks for the advice.

    void ProcessManager::start_process(QString name)
    {
        // Use resizable buffers, unlike the fixed-size system pipe buffers.
        QByteArray byteerr;
        QByteArray byteout;
        // setup_connections(curlProcess, name);
        QString command = QString("curl \"\"").arg(mKeyword);
        QStringList args;
        args << "-c" << command;
        curlProcess->start(getShell(), args);   // this start() call is missing from the posted snippet
        if (curlProcess->waitForFinished(-1)) {
            perlProcess->start("perl", { dbpediaPlaceFile, "Haeinsa" }, QIODevice::ReadOnly);
            // Give the child process some time to start.
            perlProcess->waitForFinished(-1);
            // Read all available data on both output streams.
            byteerr += perlProcess->readAllStandardError();
            byteout += perlProcess->readAllStandardOutput();
            qDebug() << byteerr;
            qDebug() << byteout;
        }
    }

  • @darongyi
    I believe here you are running the first curl process to completion (curlProcess->waitForFinished(-1)) and only then starting the perl process, with the curl's output feeding into the perl's input.

    This should block and get stuck depending on the amount of output from the curl. Try it with more than something like 4 or 8K coming from the curl and you should "hang".

    Assuming I am correct, you must not write it this way. You cannot afford to waitForFinished() on a process writing to an output pipe. You must start the reading process before you try to wait for the writing process to finish.

  • @JonB Thank you very much. I'm editing the code now.

  • @darongyi
    You don't actually need to waitForFinished() on anything other than the last command in a pipeline. Effectively each command waits for the one to the left of it to finish.

    I'd do something like:

    1. Create the QProcess for each process.

    2. Direct the output from the curl into the input for the perl.

    3. Start the curl, and probably waitForStarted() that one.

    4. Start the perl.

    5. waitForFinished() on the perl one.
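    The five steps above might look like this. It is only a sketch: printf and wc -c are hypothetical stand-ins for the curl and perl commands, since the real URL and scripts are not shown in the thread.

    ```cpp
    #include <QCoreApplication>
    #include <QProcess>
    #include <QDebug>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);

        // 1. Create a QProcess for each command in the pipeline.
        QProcess curlLike;   // the producer (curl in the thread)
        QProcess perlLike;   // the consumer (perl in the thread)

        // 2. Direct the producer's stdout into the consumer's stdin.
        curlLike.setStandardOutputProcess(&perlLike);

        // 3. Start the producer, and waitForStarted() on that one.
        curlLike.start("printf", {"abc"});
        if (!curlLike.waitForStarted(-1)) {
            qWarning() << "failed to start producer";
            return 1;
        }

        // 4. Start the consumer.
        perlLike.start("wc", {"-c"});

        // 5. waitForFinished() only on the last process in the pipeline.
        perlLike.waitForFinished(-1);

        const QByteArray out = perlLike.readAllStandardOutput().trimmed();
        qDebug().noquote() << out;  // byte count written by the producer
        return 0;
    }
    ```

    There is no waitForFinished() on the producer at all; the consumer naturally runs until the producer closes its end of the pipe.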

    One other thing: all you seem to do is run the curl, send all its output to the perl, and then pick up the output from the perl. Nothing more complicated. Therefore you might find it much simpler code-wise to just run a single command:

    start("/bin/sh", QStringList() << "-c" << "curl ... | perl ...");  // on Windows: "cmd" with "/c"
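    A runnable sketch of that single-command approach, letting the shell build the whole pipeline (echo and tr are placeholders for the elided curl and perl commands):

    ```cpp
    #include <QCoreApplication>
    #include <QProcess>
    #include <QDebug>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);

        QProcess p;
        // One QProcess is enough: /bin/sh sets up the pipe between the two commands.
        p.start("/bin/sh", {"-c", "echo hello | tr a-z A-Z"});
        p.waitForFinished(-1);

        const QByteArray out = p.readAllStandardOutput().trimmed();
        qDebug().noquote() << out;  // prints HELLO
        return 0;
    }
    ```

    The trade-off is that quoting and error reporting now go through the shell, but the Qt-side code shrinks to a single start()/waitForFinished() pair.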
