The current implementation of `SDL_GL_GetAttribute()` is not platform-specific. While that is fine from a maintainability standpoint, it misses a notorious peculiarity of Windows: information that a driver has to report correctly to the system itself, because system routines depend on it, may turn out to be more accurate and plausible than what the same driver reports to applications alone.
What prompted me to write this request was the fact that on my laptop (Intel HD Graphics 3000, Windows 7 x64, SDL2, don't look at me like that) a call to `SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, ...)` results in `GL_INVALID_OPERATION`, although it succeeds when switching to the NVIDIA GeForce GT 520MX on the same machine. The stencil buffer was requested for the GL context via `SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8)`. Moreover, an 8-bit stencil buffer is supported by both video chips, and my rendering code that uses it seems to work. So I believe there is a bug in the ICD driver.
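For reference, a minimal repro sketch of what I am doing (error handling trimmed, window title and size arbitrary):

```c
/* Minimal repro sketch: request an 8-bit stencil buffer, create a GL
   context, then read the attribute back. Error handling is trimmed. */
#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);

    SDL_Window *window = SDL_CreateWindow("stencil test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext context = SDL_GL_CreateContext(window);

    int bits = 0;
    if (SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &bits) != 0) {
        /* On the Intel HD 3000 this branch is taken: the underlying
           GL query raises GL_INVALID_OPERATION. */
        printf("SDL_GL_GetAttribute failed: %s\n", SDL_GetError());
    } else {
        printf("stencil bits: %d\n", bits);
    }

    SDL_GL_DeleteContext(context);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```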
Of course, I updated my Intel HD Graphics 3000 drivers from the stock 8.15.10.2622 to the latest 9.17.10.4229, but the behavior remains the same: the stencil buffer is available, but information about its bit depth is not accessible through SDL. However, one can ask WGL for this (and by WGL I also mean GDI, with which it is inextricably tied), which has to be involved in context creation anyway. Both the modern `wglGetPixelFormatAttribivARB()` call with `WGL_STENCIL_BITS_ARB` and the classic `DescribePixelFormat()` function, which populates `PIXELFORMATDESCRIPTOR.cStencilBits`, produce the correct answer. Thus, we return to my initial assumption: if the Intel driver also deceived GDI, it would lead to much bigger problems than a frustrated developer lost somewhere in the Far East.
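To illustrate, here is a sketch of those two Windows-side queries, assuming `hdc` is the device context of the GL window and `pf` is the pixel format already set on it (the modern path also assumes a current GL context, since extension entry points are resolved through `wglGetProcAddress()`):

```c
/* Sketch of the two Windows-side stencil-bits queries described above. */
#include <windows.h>
#include <GL/gl.h>

/* Not in <GL/gl.h>; value taken from the WGL_ARB_pixel_format spec. */
#define WGL_STENCIL_BITS_ARB 0x2023

typedef BOOL (WINAPI *PFNWGLGETPIXELFORMATATTRIBIVARBPROC)(
    HDC, int, int, UINT, const int *, int *);

/* Classic path: DescribePixelFormat() fills in cStencilBits. */
int stencil_bits_from_gdi(HDC hdc, int pf)
{
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, pf, sizeof(pfd), &pfd);
    return pfd.cStencilBits;
}

/* Modern path: WGL_ARB_pixel_format, resolved at runtime. */
int stencil_bits_from_wgl(HDC hdc, int pf)
{
    PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB =
        (PFNWGLGETPIXELFORMATATTRIBIVARBPROC)
            wglGetProcAddress("wglGetPixelFormatAttribivARB");
    int attrib = WGL_STENCIL_BITS_ARB;
    int value = 0;
    if (wglGetPixelFormatAttribivARB &&
        wglGetPixelFormatAttribivARB(hdc, pf, 0, 1, &attrib, &value)) {
        return value;
    }
    return -1; /* extension unavailable */
}
```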
In the end, my suggestion is to make `SDL_GL_GetAttribute()` rely not only on information from the OpenGL implementation, but also on information from the operating system itself. Unfortunately, I really don't know which of the two should be given higher priority, and when. A hypothetical sketch of one possible ordering follows.
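This is not a proposed patch, and the function names are invented for illustration; it just shows one possible fallback order, reusing `stencil_bits_from_gdi()` from the previous snippet:

```c
/* Hypothetical fallback order (invented for illustration): trust the
   GL implementation when it answers, otherwise fall back to asking
   GDI about the pixel format that was actually chosen. */
int get_stencil_size_with_fallback(HDC hdc, int pf, int *value)
{
    if (SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, value) == 0) {
        return 0; /* the GL implementation answered */
    }
    *value = stencil_bits_from_gdi(hdc, pf); /* OS-side answer */
    return 0;
}
```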
Here is a thread about a quite similar problem: https://community.khronos.org/t/getting-gl-stencil-bits-returns-0/75714
But it describes the exact opposite situation: there it's the Intel HD Graphics 4000 that is telling the truth, while the AMD card is being disingenuous. I found it interesting as further proof that this is not as rare and/or "Intel-specific" a case as it may seem.