GtkGLArea renders a specific shader at low resolution (compared to GLFW)

I’m trying to modify the classic GtkGLArea sample by replacing the triangle with the Suzanne obj model and shaders from the old, classic OpenGL tutorial 8 (www.opengl-tutorial.org/beginners-tutorials/tutorial-8-basic-shading).
Of course I’ve built the source against the latest GTK from the repo, on my laptop hardware with Linux (Fedora), and I can confirm that the GLFW version renders it perfectly, while unfortunately it comes out at very low resolution in the GtkGLArea.
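One way to make the “low resolution” measurable (a hedged diagnostic sketch, not a fix; the handler name is mine): log the viewport GTK actually gives you inside the render signal and compare it with the GLFW framebuffer size, since on HiDPI setups the two can differ by the window scale factor.

#include <gtk/gtk.h>
#include <epoxy/gl.h>
#include <iostream>

// Sketch: log the framebuffer viewport GTK sets up before "render" runs.
static gboolean on_render(GtkGLArea *area, GdkGLContext *ctx, gpointer)
{
    GLint vp[4] = {0, 0, 0, 0};
    glGetIntegerv(GL_VIEWPORT, vp);
    std::cerr << "viewport: " << vp[2] << "x" << vp[3]
              << " (widget scale factor: "
              << gtk_widget_get_scale_factor(GTK_WIDGET(area)) << ")\n";
    // ... actual drawing goes here ...
    return TRUE; // stop other handlers
}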

I have opened an issue on GitLab about this.

Here a potential solution was automatically suggested. I will try it and let you know.

Full code is in my gist: gist.github.com/giuliohome/2b86e64a0186e307f53c3746bfaa6102

Below is the specific part with the VAO and the shader loader (originally from github.com/opengl-tutorials/ogl/blob/master/common/shader.cpp#L17):

// Create and bind a vertex array object (required before vertex setup in core profiles)
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);

// Create and compile our GLSL program from the shaders
m_Program = glCreateProgram();
LoadShaders(m_Program, "StandardShading.vertexshader", "StandardShading.fragmentshader");
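For context, here is a minimal sketch of how that initialization would typically sit in a GtkGLArea “realize” handler; the handler name is mine, and LoadShaders is assumed to be the author’s adaptation of the tutorial helper. The context must be made current (and checked for errors) before any gl* call.

#include <gtk/gtk.h>
#include <epoxy/gl.h>

static GLuint VertexArrayID = 0;
static GLuint m_Program = 0;

// Assumed helper from the gist, adapted from the tutorial's shader.cpp.
void LoadShaders(GLuint program, const char *vertex_path, const char *fragment_path);

static void on_realize(GtkGLArea *area, gpointer)
{
    gtk_gl_area_make_current(area);
    if (gtk_gl_area_get_error(area) != NULL)
        return; // context creation failed; nothing to set up

    glGenVertexArrays(1, &VertexArrayID);
    glBindVertexArray(VertexArrayID);

    m_Program = glCreateProgram();
    LoadShaders(m_Program, "StandardShading.vertexshader",
                "StandardShading.fragmentshader");
}

One related GtkGLArea default worth knowing: unlike a typical GLFW window, the area has no depth buffer unless you request one with gtk_gl_area_set_has_depth_buffer(area, TRUE), which depth-tested shading like the tutorial’s needs.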

I’d like to point out that the gthree library loads very detailed, high-resolution models (including the monkey head) and uses GtkGLArea without any issue.

Great, I’ll try.

Just a quick update: I’ve added this check,

// Query which VAO is currently bound and compare it with mine.
GLint default_VAO;
glGetIntegerv(GL_VERTEX_ARRAY_BINDING, &default_VAO);
std::cerr << "default VAO " << std::to_string(default_VAO) << "\n";
std::cerr << "my VAO " << std::to_string(m_Vao) << "\n";

but nothing changed, because the default VAO is 0, the same as my VAO. The output is below:

[giuliohome@localhost glarea]$ ./myglarea 
default VAO 0
my VAO 0

So, I’m confused… :confused: I’m only a beginner, and I can’t solve this issue.
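A side note that may help narrow this down: glGenVertexArrays only hands out names greater than or equal to 1, so if m_Vao prints as 0 the VAO was apparently never generated in this context (for example, the call ran before the context was current). A small hedged check, using the thread’s variable names:

// Sketch: distinguish "VAO never generated" from "VAO generated but not bound".
// Assumes an active GL context and #include <iostream>; m_Vao is the member
// that should hold the name returned by glGenVertexArrays.
if (m_Vao == 0) {
    std::cerr << "m_Vao is 0: glGenVertexArrays never ran (or ran without a current context)\n";
} else {
    GLint bound = 0;
    glGetIntegerv(GL_VERTEX_ARRAY_BINDING, &bound);
    if ((GLuint) bound != m_Vao)
        std::cerr << "VAO " << m_Vao << " exists but is not currently bound\n";
}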

Now I will definitely try the gthree library. Thank you. I hope it is the solution, but in the meantime I have to study and understand the required changes…

@ebassi quick question: do you already have an example of GtkGLArea with the gthree library?
Thanks again.

I don’t understand what that means, sorry.

Gthree is already using a GtkGLArea, through the GthreeArea subclass of GtkGLArea.

Gthree has multiple examples as well.

Yes, I see; I found a useful issue with the instructions to compile. :cool: I’m running the samples and, yes, they look very nice! Which one has the “monkey head” :monkey_face: :monkey:, i.e. the Suzanne obj model as in my original attempt? That was my question, out of curiosity.

Thank you

Edit
Never mind: it’s in ./performance-gtk4! (BTW, I’ve changed the meson default from gtk3 to gtk4.)

However, most of the examples use a different technology, and note that Suzanne.js in the performance sample is not rendered with its colors. There is an example with vertex and fragment shaders, but it uses cubes and the code is quite different, so a technical comparison is hard. Also, if the gthree area is a subclass of GtkGLArea, I still wonder what the real problem is in my initial question and my initial app. Maybe the drivers or the OpenGL version? Is anything below relevant?

[giuliohome@localhost glarea]$ glxinfo | grep "version"
server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
    Max core profile version: 3.3
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.0
OpenGL core profile version string: 3.3 (Core Profile) Mesa 21.1.7
OpenGL core profile shading language version string: 3.30
OpenGL version string: 3.0 Mesa 21.1.7
OpenGL shading language version string: 1.30
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 21.1.7
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
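Worth noting: glxinfo reports the maximum versions the driver can offer, not the version of the context GDK actually created for the GtkGLArea. A hedged sketch to print what the widget really got, to be run where area is your GtkGLArea (e.g. at the top of the realize handler):

// Sketch: query the context GTK created; assumes gtk/gtk.h, epoxy/gl.h,
// and <iostream> are included and the area's context can be made current.
gtk_gl_area_make_current(area);
std::cerr << "GL_VERSION:  " << (const char *) glGetString(GL_VERSION) << "\n"
          << "GL_RENDERER: " << (const char *) glGetString(GL_RENDERER) << "\n"
          << "GLSL:        " << (const char *) glGetString(GL_SHADING_LANGUAGE_VERSION) << "\n";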

I still see a warning that I can’t understand when I run my app:

(myglarea:36138): Gdk-WARNING **: 00:49:45.889: OPENGL:
    Source: API
    Type: Error
    Severity: High
    Message: GL_INVALID_ENUM in glDrawBuffers(invalid buffer GL_BACK)

It seems to be related to something internal to the GTK mechanism, perhaps the buffer swapping?
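For what it’s worth, the Source/Type/Severity format of that warning suggests it comes through a GL_KHR_debug callback that GDK installs (glxinfo above shows GL_KHR_debug: yes), and GL_BACK is only meaningful for the default window-system framebuffer, while GTK draws GL content into its own framebuffer; that would be consistent with the guess that this is internal. If you want to trace such errors in your own code, you can install a debug callback yourself. A hedged sketch (note it would replace whatever callback GDK installed on that context, so it is purely diagnostic):

#include <epoxy/gl.h>
#include <cstdio>

// Sketch: log every GL debug message at the point it is generated.
static void APIENTRY gl_debug_cb(GLenum source, GLenum type, GLuint id,
                                 GLenum severity, GLsizei length,
                                 const GLchar *message, const void *user)
{
    std::fprintf(stderr, "GL debug: source=0x%x type=0x%x severity=0x%x: %s\n",
                 source, type, severity, message);
}

// Once the context is current (e.g. in the realize handler):
//   glEnable(GL_DEBUG_OUTPUT);
//   glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); // report errors at the offending call
//   glDebugMessageCallback(gl_debug_cb, nullptr);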

Edit

Interesting: the above warning disappears if I log in to GNOME on Xorg instead of Wayland!

Edit 2

And the warning is also fixed on Wayland with

export GDK_BACKEND=wayland
export GSK_RENDERER=cairo

Indeed, it becomes a message instead of a warning, visible with

export GDK_DEBUG=opengl

and it reads “Flushing GLX buffers for drawable…” (more details below):

Gdk-Message: 11:41:54.053: OpenGL version: 3.3 (core)
* GLSL version: 3.30
* Extensions checked:
 - GL_ARB_texture_non_power_of_two: yes
 - GL_ARB_texture_rectangle: yes
 - GL_KHR_debug: yes
 - GL_EXT_unpack_subimage: yes
* Using texture rectangle: no
default VAO 0
my VAO 0
Loading OBJ file suzanne.obj...
Compiling shader : StandardShading.vertexshader
Compiling shader : StandardShading.fragmentshader
Linking program
Gdk-Message: 11:41:54.075: Making GLX context 0x20c4eb0 current to drawable 31457289
Gdk-Message: 11:41:54.076: Making GLX context 0x20c4eb0 current to drawable 31457294
Gdk-Message: 11:41:54.083: Flushing GLX buffers for drawable 31457294 (window: 31457284), frame sync: no

Anyway, even after fixing the warning via GNOME Xorg, the original issue persists.

If I try the suggestion from that GitLab issue, namely using LIBGL_ALWAYS_SOFTWARE=true to switch to llvmpipe, I see OpenGL version 4.5 (core), and now Suzanne renders without colors,

versus the unset LIBGL_ALWAYS_SOFTWARE with OpenGL version 3.3 (core), where it is colored but low resolution.

Closed as per my final remarks on GitLab.
