Why does gl_FragCoord.z in OpenGL deviate from the depth buffer value?
-
Hi,
I have been trying for some time to find a way to read, in OpenGL, the depth value at a particular mouse coordinate (x, y). Everything works fine on Win10 with OpenGL 4.x, but not with OpenGL ES 3.x.

My approaches:
1.) glReadPixels() does not work on OpenGL ES for the depth buffer
2.) ray casting is not suitable because I work with a large terrain model
3.) the following method would suffice, but unfortunately it is too inaccurate, even on Win10; why?

    #version 420

    uniform vec2 screenXy;
    uniform vec2 screenSize;

    out vec4 fragColor;

    void main(void)
    {
        if ((int(gl_FragCoord.x) == int(screenXy.x)) &&
            ((int(screenSize.y) - int(gl_FragCoord.y)) == int(screenXy.y))) {
            fragColor.r = gl_FragCoord.z;
        } else {
            fragColor = vec4(1, 1, 1, 1.0);
        }
    }
I pass the mouse x/y coordinates to the fragment shader (screenXy). If the current fragment is the clicked pixel, I write the depth value into the color buffer. This works, but the value of gl_FragCoord.z and the one from the depth buffer are not exactly the same (I know the one from the depth buffer is correct), even though gl_FragCoord.z and the depth buffer value are both float, i.e. 32 bit as far as I know.
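For completeness, on the C++ side I set the two uniforms roughly like this (a simplified sketch; m_program is a placeholder for my shader program handle, m_func and m_pFbo are the same objects as in the read-back code below):

    // Sketch: pass the clicked pixel and the viewport size to the fragment shader.
    m_func->glUseProgram(m_program);
    m_func->glUniform2f(m_func->glGetUniformLocation(m_program, "screenXy"),
                        float(xy.x()), float(xy.y()));
    m_func->glUniform2f(m_func->glGetUniformLocation(m_program, "screenSize"),
                        float(m_pFbo->width()), float(m_pFbo->height()));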
The read-back then looks like this:

    GLfloat zd;   // from depth buffer
    GLfloat z[4]; // from color buffer
    m_func->glReadPixels(xy.x(), m_pFbo->height() - xy.y(), 1, 1,
                         GL_DEPTH_COMPONENT, GL_FLOAT, &zd);
    m_func->glReadPixels(xy.x(), m_pFbo->height() - xy.y(), 1, 1,
                         GL_RGBA, GL_FLOAT, z);
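If the mismatch comes from the precision of the color attachment, I guess I could render into an FBO with an explicit 32-bit float color attachment instead of the default one. A minimal sketch of what I have in mind (untested, plain GL calls, width/height are placeholders; on ES 3.x a float color attachment needs EXT_color_buffer_float, and the read-back format may have to be queried via GL_IMPLEMENTATION_COLOR_READ_FORMAT/TYPE):

    // Sketch: FBO with a GL_R32F color attachment and a 24-bit depth attachment,
    // so the value written to fragColor.r keeps full float precision.
    GLuint fbo = 0, colorTex = 0, depthRb = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_R32F, width, height);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTex, 0);

    glGenRenderbuffers(1, &depthRb);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRb);

    // ... render the terrain, then read back only the red channel:
    GLfloat zc = 0.0f;
    glReadPixels(xy.x(), height - xy.y(), 1, 1, GL_RED, GL_FLOAT, &zc);

Would that even be the right direction?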
Reasons:
1.) the deviation comes from an internal type conversion, but where?
2.) because GL_DEPTH_TEST is executed after the fragment shader, gl_FragCoord.z is not necessarily the closest value (to the camera), i.e. not the one that is saved in the depth buffer. So it would also make no sense to write gl_FragCoord.z into a separate framebuffer, because it is not the correct value.

Can someone maybe help me untangle this knot? I cannot find any other explanation.
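Regarding reason 2.): I wonder whether forcing the depth test to run before the fragment shader would change anything. This is just a guess on my part (it needs GL 4.2 on desktop or ES 3.1), and the rest of the shader would stay unchanged:

    #version 420
    // force early depth testing, so only the surviving (closest) fragment
    // actually writes its gl_FragCoord.z into the color buffer
    layout(early_fragment_tests) in;

    uniform vec2 screenXy;
    uniform vec2 screenSize;
    out vec4 fragColor;
    // ... main() as above, unchanged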
Here are some measured values (zc from the color buffer, zd from the depth buffer):

    zc 0.984314   zd 0.985363
    zc 0.552941   zd 0.554653
    zc 1          zd 0.999181   -> zc == 1 is extremely critical
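One thing I noticed while staring at the numbers: the zc values are exactly what I get if I quantize the corresponding zd to 8 bits. That makes me suspect the type conversion from reason 1.) happens in the color buffer format, but I am not sure; the snippet below is just plain arithmetic to illustrate it:

    // Plain C++ check (no GL): rounding zd to an 8-bit normalized value and back
    // reproduces the zc values I read from the color buffer.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const float zd[3] = { 0.985363f, 0.554653f, 0.999181f };
        for (float d : zd) {
            float q = std::round(d * 255.0f) / 255.0f; // 8-bit round trip
            std::printf("zd = %f  ->  8-bit quantized = %f\n", d, q);
        }
        // prints 0.984314, 0.552941 and 1.000000 - the same values as zc above
        return 0;
    }

Does that mean my color attachment is only RGBA8? And if so, where exactly does the conversion happen?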
Thanks in advance...