QPainter effects removed when rendering a QOpenGLWidget into a QPixmap
-
I have a QOpenGLWidget that does some OpenGL rendering and some non-OpenGL rendering using a QPainter. The widget looks OK, but when I try to create a screenshot of the widget by rendering it into a QPixmap, suddenly all effects painted using QPainter disappear. Here is a minimal code sample to reproduce the issue:

#include <QApplication>
#include <QOpenGLWidget>
#include <QOpenGLFunctions>
#include <QPushButton>
#include <QPainter>

class MainWidget : public QOpenGLWidget, QOpenGLFunctions
{
public:
    QPushButton* screenshot_button;

    MainWidget(QWidget* parent = nullptr) : QOpenGLWidget(parent)
    {
        screenshot_button = new QPushButton("screenshot", this);
        QObject::connect(screenshot_button, &QPushButton::clicked, [this]() {
            take_screenshot();
        });
    }

    void initializeGL() override
    {
        initializeOpenGLFunctions();
    }

    void take_screenshot()
    {
        QPixmap pixmap(size());
        render(&pixmap, QPoint(), QRegion(rect()));
        pixmap.save("screenshot.png");
    }

    void paintGL() override
    {
        QPainter painter(this);

        painter.beginNativePainting();
        glClearColor(0.80, 0.80, 0.80, 1);
        glClear(GL_COLOR_BUFFER_BIT);
        painter.endNativePainting();

        // this disappears when the screenshot button is pressed!
        // also it is not present in the screenshot
        painter.drawRect(QRect(0, 0, 100, 100));
    }
};

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    MainWidget w;
    w.show();
    return a.exec();
}
Before pressing the button, the widget looks normal:
But after pressing the screenshot button, the rectangle disappears from the widget (it is also absent from screenshot.png) until I resize the window, which forces a re-render.
I am using Qt 6.5 on Windows 10.
-
Hi,
What do you get if you use grabFramebuffer()?
-
When I replace the contents of take_screenshot with this:

QImage img = grabFramebuffer();
QPixmap pixmap = QPixmap::fromImage(img);
pixmap.save("screenshot.png");

Here is the resulting screenshot:
Now it has the rectangle, but it's still not what I want because it doesn't render the rest of the widgets.
-
I think that this would rather be a feature request.
One possible workaround (I haven't tried it though) would be to grab the framebuffer and then apply the painting on the returned image. Not ideal but could be enough for your needs.
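Roughly something like this (untested sketch; the QPainter calls from your paintGL() would have to be repeated here, I use your drawRect() as an example):

void take_screenshot()
{
    // grab the OpenGL content of the widget as a QImage
    QImage img = grabFramebuffer();

    // then re-apply the QPainter effects on top of the grabbed image
    // note: on high-DPI screens the grabbed image is in device pixels,
    // so the coordinates may need scaling
    QPainter painter(&img);
    painter.drawRect(QRect(0, 0, 100, 100)); // same rect as in paintGL()
    painter.end();

    img.save("screenshot.png");
}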
-
@SGaist Thanks, while this might work for the simple minimal code that I provided, the actual use case is in a complex application with many widgets drawn on top of the QOpenGLWidget. So simply drawing on the pixmap paints over the other widgets drawn on top of the QOpenGLWidget.
Also, I don't know if it is a feature request. What feature exactly am I requesting? XD I just want render to work correctly when rendering into a pixmap.
-
Solution 1:
From the QScreen::grabWindow() documentation:

"The grabWindow() function grabs pixels from the screen, not from the window, i.e. if there is another window partially or entirely over the one you grab, you get pixels from the overlying window, too."

Meaning it grabs from the screen's framebuffer, unlike QWidget::grab and QWidget::render, which ask the widget to render itself. So it could be used instead of render() to avoid the problems caused by combining the latter with OpenGL:

void take_screenshot()
{
    QGuiApplication::primaryScreen()->grabWindow(winId()).save("screenScreenshot.png");
}
Solution 2:
Less straightforward, but just in case render is a must.
Use a bool to disable the OpenGL part when calling render in take_screenshot(), and make sure you call grabFramebuffer() before:

void take_screenshot()
{
    grabFramebuffer(); // must be called before render() for this workaround
    QPixmap pixmap(size());
    enable_opengl = false; // member flag, e.g. bool enable_opengl = true;
    render(&pixmap, QPoint(), QRegion(rect()));
    enable_opengl = true;
    pixmap.save("renderScreenshot.png");
}

void paintGL()
{
    QPainter painter(this);
    if (enable_opengl)
    {
        glClearColor(0.80, 0.80, 0.80, 1);
        glClear(GL_COLOR_BUFFER_BIT);
    }
    painter.fillRect(QRect(10, 10, 100, 100), Qt::green);
}
-
If possible, I'm hoping to get an explanation about the 2nd solution, because I stumbled upon it while experimenting.
I have never used OpenGL, but I'm interested in why calling grabFramebuffer() and disabling OpenGL before calling render fixes the problem (I'm not even sure it does fix it).
I read that grabFramebuffer() relies on glReadPixels() and read its documentation, but it's still unclear to me.
I also took a look at the grabFramebuffer() source code to try to see if I can replicate what it's doing without calling it, but failed to understand that as well.
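The closest I got is something along these lines (an untested sketch of what I think the grab roughly boils down to, written as a member of the widget; it ignores multisampling and any alpha handling, so it may well be wrong):

void manual_grab()
{
    makeCurrent();                                // binds this widget's context and its framebuffer object
    const QSize px = size() * devicePixelRatio(); // the framebuffer is in device pixels
    QImage img(px, QImage::Format_RGBA8888);
    glReadPixels(0, 0, img.width(), img.height(),
                 GL_RGBA, GL_UNSIGNED_BYTE, img.bits());
    doneCurrent();
    img.mirrored().save("manualGrab.png");        // OpenGL stores rows bottom-up
}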
I'm just trying to learn something new, so it's not a problem I'm facing, nor is it urgent.
Thanks if anyone decides to look into this!