How to get the amount of total and available GPU (CUDA) memory? (QOpenGLFunctions)
-
I'm trying to write a program for monitoring the statistics of my laptop and ran into an issue. By googling I found that I can get the total and available GPU memory by executing the code snippet below, but the application crashes on start. Can you give me a hint about what I am doing wrong? Thank you.
Compiler: Qt 5.6.2 with MSVC 2013 (32-bit)
OS: Windows 7
Code Snippet:
#include "mainwindow.h"
#include <QApplication>
#include <QDebug>
#include <qopengl.h>
#include <QOpenGLFunctions>
#define GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX 0x9048
#define GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX 0x9049
int main(int argc, char *argv[])
{
QApplication a(argc, argv);
QOpenGLFunctions func;
GLint total_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX,&total_mem_kb);
GLint cur_avail_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX,&cur_avail_mem_kb);
qDebug()<<cur_avail_mem_kb;
return a.exec();
} -
Hi!
QOpenGLFunctions func;
You can't use func without a current OpenGL context and before you've initialized it. Please consult the corresponding manual page. -
@Wieland Like this? (It still crashes.) The context is being created; I have checked that:
QOpenGLContext *_context = new QOpenGLContext(this);
_context->create();
_context->functions()->initializeOpenGLFunctions();
QOpenGLFunctions func(QOpenGLContext::currentContext());
GLint total_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX, &total_mem_kb);
GLint cur_avail_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX, &cur_avail_mem_kb);
qDebug() << cur_avail_mem_kb;
-
#include <QGuiApplication>
#include <QOpenGLFunctions>
#include <QOpenGLContext>
#include <QOffscreenSurface>
#include <QDebug>

#define GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX 0x9048
#define GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX 0x9049

int main(int argc, char *argv[])
{
    QGuiApplication a(argc, argv);

    QOpenGLContext context;
    context.create();

    // An offscreen surface lets us make the context current without a visible window.
    QOffscreenSurface surface;
    surface.create();

    QOpenGLFunctions func;
    // The context must be current before the functions are initialized and used.
    context.makeCurrent(&surface);
    func.initializeOpenGLFunctions();

    GLint total_mem_kb = 0;
    func.glGetIntegerv(GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX, &total_mem_kb);
    GLint cur_avail_mem_kb = 0;
    func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX, &cur_avail_mem_kb);
    qDebug() << cur_avail_mem_kb;

    return a.exec();
}
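By the way, if the two counters stay at 0, it is worth checking first whether the driver exposes the GL_NVX_gpu_memory_info extension at all; the query only works on NVIDIA drivers. A rough sketch, assuming the same context/surface setup as above:
// Query the NVX memory counters only if the extension is actually reported.
if (context.hasExtension(QByteArrayLiteral("GL_NVX_gpu_memory_info"))) {
    GLint total_mem_kb = 0;
    GLint cur_avail_mem_kb = 0;
    func.glGetIntegerv(GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX, &total_mem_kb);
    func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX, &cur_avail_mem_kb);
    qDebug() << "total (kB):" << total_mem_kb << "available (kB):" << cur_avail_mem_kb;
} else {
    qDebug() << "GL_NVX_gpu_memory_info is not exposed by this context";
}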
-
@Wieland Thank you.
-
@Wieland An issue occurred: the functions above don't give the expected result, they return 0, even though they are supposed to return the GPU memory usage. What can be the issue?
-
Huh? The function returns void, not 0:
void glGetIntegerv(GLenum pname, GLint *data)
The results you asked for are stored in total_mem_kb and cur_avail_mem_kb. -
@Wieland Yes, I know. I meant that these two variables are not being initialized by the function.
QOpenGLContext context;
context.create();
QOffscreenSurface surface;
surface.create();
QOpenGLFunctions func;
context.makeCurrent(&surface);
func.initializeOpenGLFunctions();
qDebug() << func.glGetString(GL_RENDER);
qDebug prints something like 0x0, which is for sure wrong.
-
Hmm... worked for me. Tested on an Nvidia GTX 1080, 64-bit desktop Linux, with Nvidia's proprietary drivers. I don't have that machine at hand right now, so I can't tell for sure which driver version it has.
-
@Wieland Have you tried it on Windows? I ran this code on Windows 7 and Windows 10 on different laptops, and the result was 0x0.
-
Wait, I can check with a Windows 7 laptop.
-
Okay, the code I posted above works here, too.
qDebug() << QString::fromLatin1((const char*)func.glGetString(GL_EXTENSIONS));
prints all the extensions, and
qDebug() << QString::fromLatin1((const char*)func.glGetString(GL_RENDER));
prints an empty string. -
@Wieland I tried the above code and it gives a list of extensions. But the main thing I need is the available GPU memory.
QOpenGLContext context;
context.create();
QOffscreenSurface surface;
surface.create();
QOpenGLFunctions func;
context.makeCurrent(&surface);
func.initializeOpenGLFunctions();
GLint cur_avail_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX, &cur_avail_mem_kb);
qDebug() << cur_avail_mem_kb;
cur_avail_mem_kb stays equal to the value it was initialized with, in this case zero. What can be the problem? Does this code work on your computer?
-
Yes, the following prints the correct results:
GLint total_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_TOTAL_AVAILABLE_MEM_NVX, &total_mem_kb);
qDebug() << total_mem_kb;
GLint cur_avail_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX, &cur_avail_mem_kb);
qDebug() << cur_avail_mem_kb;
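If it still stays at 0 on your machine, you could also check glGetError() right after the query; if the driver doesn't know the enum, it should leave the variable untouched and report GL_INVALID_ENUM (1280). A rough sketch, assuming the same setup as before:
GLint cur_avail_mem_kb = 0;
func.glGetIntegerv(GL_GPU_MEM_INFO_CURRENT_AVAILABLE_MEM_NVX, &cur_avail_mem_kb);
// GL_INVALID_ENUM (1280) here would mean the driver doesn't support this query,
// which is what you would expect on a non-NVIDIA context.
GLenum err = func.glGetError();
qDebug() << "available (kB):" << cur_avail_mem_kb << "glGetError:" << err;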
-
@Wieland A big thank you for the time you invested in solving the problem. It looks like something is wrong on my side then; I'm updating my drivers now and after that I'll give it another try.
-
@mandruk1331 You're welcome. Good luck, I hope you can solve it. These things are almost always driver-related issues.
-
@Wieland OK, so I updated everything I could, but that did not solve the problem. However, by asking the same question on StackOverflow, I now have a solution. The code above works fine; the main problem was that my primary graphics card was the Intel one, so I had to change the video card settings so that the Nvidia card is used as the primary one. Doing that solved the problem.
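For anyone hitting the same thing: you can check which GPU the context actually ended up on by printing the vendor and renderer strings. On Optimus laptops the NVIDIA driver can also be asked to prefer the dedicated GPU by exporting NvOptimusEnablement from the executable; I switched the preferred GPU in the driver control panel instead, so treat the export below as an untested sketch:
// Optional, Windows/Optimus laptops only: hint the NVIDIA driver to use the
// dedicated GPU for this executable. Must be at global scope in the .exe itself.
extern "C" __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;

// Inside main(), after makeCurrent() and initializeOpenGLFunctions(), verify
// which GPU the context actually runs on:
qDebug() << "Vendor:" << QString::fromLatin1((const char*)func.glGetString(GL_VENDOR));
qDebug() << "Renderer:" << QString::fromLatin1((const char*)func.glGetString(GL_RENDERER));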
-
@mandruk1331 Hey, glad you solved it. And thanks for sharing the solution :)