On Windows, consider also the information from pixel format descriptor in SDL_GL_GetAttribute() queries #12070

Open
cher-nov opened this issue Jan 23, 2025 · 0 comments

The current implementation of SDL_GL_GetAttribute() is not platform-specific. While that is fine from a maintainability standpoint, it misses a notorious peculiarity of Windows: the information that drivers report to the operating system, which is required not only for the proper operation of user programs but also for system routines, may turn out to be more accurate and plausible than what applications can retrieve on their own.

What prompted me to write this request was the fact that on my laptop (Intel HD Graphics 3000, Windows 7 x64, SDL2, don't look at me like that) a call to SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &value) results in GL_INVALID_OPERATION, although it succeeds when switching to the NVIDIA GeForce GT 520MX on the same machine. The stencil buffer was requested for the GL context via SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8). Moreover, an 8-bit stencil buffer is supported by both video chips, and my rendering code that uses it seems to work. So I believe there is a bug in the ICD driver.
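For reference, a minimal repro sketch of what I'm describing (SDL2, C; the window parameters and error handling here are placeholders, not my actual code):

```c
#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Request an 8-bit stencil buffer before creating the window. */
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);

    SDL_Window *window = SDL_CreateWindow("repro",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext context = SDL_GL_CreateContext(window);

    int stencil_bits = 0;
    /* On the Intel HD Graphics 3000 ICD this query fails with
     * GL_INVALID_OPERATION from the driver, even though the stencil
     * buffer itself works. */
    if (SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &stencil_bits) != 0) {
        printf("SDL_GL_GetAttribute failed: %s\n", SDL_GetError());
    } else {
        printf("stencil bits: %d\n", stencil_bits);
    }

    SDL_GL_DeleteContext(context);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```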

Of course, I updated my Intel HD Graphics 3000 drivers from the stock 8.15.10.2622 to the latest 9.17.10.4229, but the behavior remains the same: the stencil buffer is available, but information about its bit depth is not accessible through SDL. However, one can ask WGL for this (and here I also mean GDI, with which it is inextricably tied), which is necessary for context creation anyway. Both the modern wglGetPixelFormatAttribivARB() call with WGL_STENCIL_BITS_ARB and the classic DescribePixelFormat() function, which populates PIXELFORMATDESCRIPTOR.cStencilBits, produce the correct answer. Thus we return to my initial assumption: if the Intel driver also deceived GDI, it would lead to much bigger problems than a frustrated developer lost somewhere in the Far East.
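To illustrate, here is roughly how both queries look against the window's device context (a sketch, assuming `hdc` is the window's DC with its pixel format already set, a GL context is current for the ARB path, and `wglext.h` supplies WGL_STENCIL_BITS_ARB and the function pointer typedef):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>  /* WGL_STENCIL_BITS_ARB, PFNWGLGETPIXELFORMATATTRIBIVARBPROC */

/* Classic GDI path: read cStencilBits from the pixel format descriptor. */
static int StencilBitsFromGDI(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format = GetPixelFormat(hdc);  /* pixel format already set on this DC */
    if (format == 0 ||
        DescribePixelFormat(hdc, format, sizeof(pfd), &pfd) == 0) {
        return -1;
    }
    return pfd.cStencilBits;
}

/* Modern ARB path: requires a current GL context to fetch the extension. */
static int StencilBitsFromWGL(HDC hdc)
{
    PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB =
        (PFNWGLGETPIXELFORMATATTRIBIVARBPROC)
            wglGetProcAddress("wglGetPixelFormatAttribivARB");
    if (wglGetPixelFormatAttribivARB == NULL) {
        return -1;
    }

    const int attrib = WGL_STENCIL_BITS_ARB;
    int value = 0;
    if (!wglGetPixelFormatAttribivARB(hdc, GetPixelFormat(hdc),
                                      0 /* main plane */, 1, &attrib, &value)) {
        return -1;
    }
    return value;
}
```

Both of these return 8 on the affected machine, matching the stencil buffer that actually works.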

In the end, my suggestion is for SDL_GL_GetAttribute() to rely not only on information from the OpenGL implementation, but also on information from the operating system itself. Unfortunately, I really don't know which of the two should be considered the higher priority, and when.
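To make the suggestion concrete, a purely illustrative application-level sketch of such a fallback (this is NOT how SDL is implemented; it reuses the hypothetical StencilBitsFromGDI() helper from the previous sketch and SDL2's SDL_GetWindowWMInfo() to obtain the HWND):

```c
#include <SDL.h>
#include <SDL_syswm.h>

/* Illustrative only: if the GL query fails, fall back to the pixel
 * format descriptor. Which source should win when both succeed but
 * disagree is exactly the open question of this issue. */
static int GetStencilSizeWithFallback(SDL_Window *window, int *value)
{
    if (SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, value) == 0) {
        return 0;  /* the OpenGL implementation answered */
    }

    SDL_SysWMinfo info;
    SDL_VERSION(&info.version);
    if (!SDL_GetWindowWMInfo(window, &info) ||
        info.subsystem != SDL_SYSWM_WINDOWS) {
        return -1;  /* no Windows window handle to ask GDI about */
    }

    HDC hdc = GetDC(info.info.win.window);
    int bits = StencilBitsFromGDI(hdc);  /* hypothetical helper, see above */
    ReleaseDC(info.info.win.window, hdc);

    if (bits < 0) {
        return -1;
    }
    *value = bits;
    return 0;
}
```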

Here is a thread about a quite similar problem: https://community.khronos.org/t/getting-gl-stencil-bits-returns-0/75714
But it describes the exact opposite situation: there it's the Intel HD Graphics 4000 that is telling the truth, while the AMD card is being disingenuous. I found it interesting as further proof that this is not as rare and/or "Intel-specific" a case as it may seem.
