
I am running a QProcess in a timer slot at 1 Hz. The process invokes a Linux command and parses its output.

The problem is this: after the program runs for about 20 minutes, I get this error:

QProcessPrivate::createPipe: Cannot create pipe 0x104c0a8: Too many open files
QSocketNotifier: Invalid socket specified

Ideally, this program would run for the entire uptime of the system, which may be days or weeks.

I think I've been careful with process control by reading the examples, but maybe I missed something. I've used the examples from the Qt website, which use the same code I've written, but those were designed for a single run, not thousands. Here is a minimal example:

class UsageStatistics : public QObject {
    Q_OBJECT
public:
    UsageStatistics() : process(new QProcess) {
       timer = new QTimer(this);
       connect(timer, SIGNAL(timeout()), this, SLOT(getMemoryUsage()));
       timer->start(1000); // one second
    }

    virtual ~UsageStatistics() {}

private:
    QProcess *process;
    QTimer *timer;

public slots:

    void getMemoryUsage() {
        process->start("/usr/bin/free");
        if (!process->waitForFinished()) {
            // error processing
        }

        QByteArray result = process->readAll();
        // parse result 

        // edit, I added these
        process->closeReadChannel(QProcess::StandardOutput);
        process->closeReadChannel(QProcess::StandardError);
        process->closeWriteChannel();
        process->close();
    }
};

I've also tried manually deleting the process pointer at the end of the function, and then new at the beginning. It was worth a try, I suppose.

Free beer for whoever answers this :)

Tyler Jandreau
  • You are leaking handles somewhere, or you have started too many QProcess-es at the same time. See https://bugreports.qt-project.org/browse/QTBUG-18934 – sashoalm Apr 26 '13 at 13:44
  • This isn't an answer to your question, but if your goal is to find out how much memory is free in the system, I'd suggest avoiding QProcess entirely and using a more lightweight mechanism, like fopen("/proc/meminfo", "r") and just read out the data directly. More efficient and less error prone :) – Jeremy Friesner Apr 26 '13 at 14:47

4 Answers


QProcess is derived from QIODevice, so I would say calling close() should close the file handle and solve your problem.

cloose

I cannot see the issue; however, one thing that concerns me is possible invocation overlap in getMemoryUsage(), where the slot is invoked again before the previous run has finished.

How about restructuring this so that a new QProcess object is used within getMemoryUsage() (on the stack, not new'd), rather than being an instance variable of the top-level class? This would ensure clean-up (with the QProcess object going out-of-scope) and would avoid any possible invocation overlap.

Alternatively, rather than invoking /usr/bin/free as a process and parsing its output, why not read /proc/meminfo directly yourself? This will be much more efficient.

trojanfoe
  • I'm not reading `/proc/meminfo` directly because it is much more difficult to parse. I actually have a function that does exactly that, but when benchmarking, I found the performance difference negligible, so I opted for easily understandable code. But, if `QProcess` doesn't pan out, I'll just use the original parse function. – Tyler Jandreau Apr 26 '13 at 16:21
  • What about the system call [sysinfo](http://man7.org/linux/man-pages/man2/sysinfo.2.html)? This call also returns the free memory on linux. – cloose Apr 26 '13 at 17:03

I had the same situation as you and got the same results. I think QProcess cannot handle its opened pipes correctly.

So, instead of QProcess, I decided to use popen() + QFile.

class UsageStatistics : public QObject {
    Q_OBJECT
public:
    UsageStatistics() {
        timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(getMemoryUsage()));
        timer->start(1000); // one second
    }

    virtual ~UsageStatistics() {}

private:
    QTimer *timer;
    QFile freePipe;
    FILE *in;

public slots:

    void getMemoryUsage() {
        if (!(in = popen("/usr/bin/free", "r"))) {
            qDebug() << "UsageStatistics::getMemoryUsage() <<" << "Cannot execute free command.";
            return;
        }

        freePipe.open(in, QIODevice::ReadOnly);
        connect(&freePipe, SIGNAL(readyRead()), this, SLOT(parseResult()));
        // OR waitForReadyRead() and parse here.
    }

    void parseResult() {
        // Parse your stuff
        freePipe.close();
        pclose(in); // The exit code is pclose()'s return value divided by 256.
    }
};
ilo

tl;dr:
This occurs because your application tries to use more file descriptors than the per-process resource limit allows. You might be able to work around it by raising the limit with the command in [2] if you have a huge application, but it is usually caused by a programming error (a handle leak).
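For reference, the relevant limit and the current descriptor count can be inspected from a shell (Linux paths assumed):

```shell
# Per-process open-file limit -- what "Too many open files" runs into
ulimit -n

# Count the descriptors the current shell has open (Linux /proc)
ls /proc/self/fd | wc -l
```

Watching the second number grow over time is a quick way to confirm a descriptor leak.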

Long:
I just solved a similar problem myself. I use a QThread to log the exit codes of QProcesses. The QThread uses curl to connect to an FTP server and upload the log. Since I was testing the software, I hadn't connected the FTP server, and curl_easy_perform apparently waits for a connection. As a result, my resource limit was reached and I got this error. After a while my program started complaining, which was the main indicator for figuring out what was going wrong.

[..]
QProcessPrivate::createPipe: Cannot create pipe 0x7fbda8002f28: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb0003128: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb4003128: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb4003128: Too many open files
[...]
curl_easy_perform() failed for curl_easy_perform() failed for disk.log
[...]

I've tested this by connecting the machine to the FTP server after the error occurred. That solved my problem.

Read:
[1] https://linux.die.net/man/3/ulimit
[2] https://ss64.com/bash/ulimit.html
[3] https://bbs.archlinux.org/viewtopic.php?id=234915

Bayou