
dual monitor gamma issues

Posted: Sat Feb 07, 2015 2:55 am
by r00k
so i have dual monitor support in Qrack, but when i change the gamma ingame it messes up the other monitor too.
is there a way to change the gamma for only 1 monitor in windows?

Re: dual monitor gamma issues

Posted: Sat Feb 07, 2015 8:20 am
by Baker
https://msdn.microsoft.com/en-us/librar ... 85%29.aspx

BOOL WINAPI SetDeviceGammaRamp(HDC hDC, LPVOID lpRamp);
"Specifies the device context of the direct color display board in question."

The API seems designed to be per display adapter. So if you have 2 different device contexts on 2 different display adapters, maybe. If both monitors are on the same display adapter, I don't think there is anything you can do, which is part of the reason why I moved away from hardware gamma methods in the engine. The wording doesn't look good for a single display adapter driving dual monitors.
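
For reference, the per-HDC call is typically wrapped up something like the sketch below; the function name is just illustrative, and which monitor it actually affects depends entirely on the HDC you pass in.

#include <windows.h>
#include <math.h>

/* Build a 3 x 256 ramp table from a gamma value and apply it to one HDC.
   Sketch only; a real engine clamps/validates the values and restores the
   original ramp on exit. */
static BOOL apply_gamma_to_dc (HDC hdc, float gamma)
{
    WORD ramp[3][256];
    int  i;

    for (i = 0; i < 256; i++)
    {
        /* pow() maps 0..1 through the gamma curve, scaled to 0..65535 */
        float f = (float) pow (i / 255.0, (double) gamma) * 65535.0f;
        if (f < 0) f = 0;
        if (f > 65535) f = 65535;
        ramp[0][i] = ramp[1][i] = ramp[2][i] = (WORD) f;
    }

    return SetDeviceGammaRamp (hdc, ramp);
}

In principle a per-monitor HDC can be obtained via EnumDisplayMonitors()/GetMonitorInfo() and CreateDC() with the reported device name, but as above, whether the driver honours it per monitor when both outputs sit on a single adapter is another matter.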

Re: dual monitor gamma issues

Posted: Sat Feb 07, 2015 9:31 am
by Spike
sucks when running windowed too. the easy solution is to just use glsl.

this way you're not fighting with graphics drivers / user settings. windows won't restrict the ramps you try to use. crashing won't leave monitors messed up. and it doesn't give windows a chance to lock up for half a second every time you change the gamma ramp (yay! no stalls when you pick up items!).
simply put, win2k broke hardware gamma ramps, and they've been unusable ever since.
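
For context, the shader half of the GLSL route is tiny - something along these lines (a sketch in old GLSL 110 so it runs on GL2-era hardware; the uniform and sampler names are made up):

/* Fragment shader for a fullscreen gamma pass: the scene is first copied
   into 'screentex', then drawn back through this shader.  Sketch only;
   'u_gamma' and 'screentex' are illustrative names. */
static const char *gamma_fragment_shader =
    "uniform sampler2D screentex;\n"
    "uniform float u_gamma;\n"
    "void main (void)\n"
    "{\n"
    "    vec4 c = texture2D (screentex, gl_TexCoord[0].xy);\n"
    "    gl_FragColor = vec4 (pow (c.rgb, vec3 (u_gamma)), 1.0);\n"
    "}\n";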

Re: dual monitor gamma issues

Posted: Sat Feb 07, 2015 7:34 pm
by ericw
I got fed up with hw gamma in Quakespasm - it stopped working for me on Windows recently, and there were lots of issues on OS X (it messes up my screen dimming software) - so I tried doing a glsl implementation: https://github.com/ericwa/Quakespasm/co ... 63e96094c2 It's reasonably self-contained and shouldn't be too hard to adapt for another engine if anyone wants.

I realize doing it with render-to-texture is kind of an old-fashioned/inferior way to do it, but I looked at using FBOs and it would have been a lot more work to integrate with the engine. The RMQEngine code was handy as a reference on how to set up RTT; I grabbed a few things from there.
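
For anyone adapting it, the no-FBO render-to-texture pass is roughly the per-frame sequence sketched below (assumptions: GL2 entry points are loaded, the screen texture already exists at window size, and depth test/blending are off):

/* assumes <GL/gl.h> plus GL2 entry points (e.g. via GLEW) are available */
void GL_GammaPostProcess (int width, int height, GLuint screentex, GLuint program)
{
    /* grab the back buffer into the texture - no FBO needed */
    glBindTexture (GL_TEXTURE_2D, screentex);
    glCopyTexSubImage2D (GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);

    /* redraw it as a fullscreen quad through the gamma shader;
       with identity modelview/projection these coords already cover the
       screen, and the 0..1 texcoords assume the texture exactly matches
       the window size */
    glUseProgram (program);
    glBegin (GL_QUADS);
    glTexCoord2f (0, 0); glVertex2f (-1, -1);
    glTexCoord2f (1, 0); glVertex2f ( 1, -1);
    glTexCoord2f (1, 1); glVertex2f ( 1,  1);
    glTexCoord2f (0, 1); glVertex2f (-1,  1);
    glEnd ();
    glUseProgram (0);
}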

Re: dual monitor gamma issues

Posted: Sat Feb 07, 2015 9:30 pm
by Spike
linked code requires npot.
linear texture filtering without clamp-to-edge requires full npot. plus linear filtering is a smidge more expensive than needed and is just weird, though in this case it makes no visual difference.
bgra is not uniformly supported.
the glClear is redundant.
changing the view+projection matrices is redundant.
it affects the full screen (so make sure the full hud is redrawn every frame).
might want to do contrast+brightness at the same time. maybe per-channel too.

Re: dual monitor gamma issues

Posted: Sun Feb 08, 2015 10:43 pm
by ericw
Thanks spike, will look into cleaning those things up.

I went for npot just because I don't have any >=GL2.0 hardware without npot, but I guess it would be easy to use a power-of-two texture (the code in RMQEngine that I based this on did use a power-of-two texture for the framebuffer copy).
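
In case it's useful, the usual power-of-two fallback is just a matter of allocating the next power of two above the window size and only sampling the corner you copied into (i.e. using the returned smax/tmax as the quad's texcoords instead of 1.0). A sketch, with illustrative names:

/* assumes <GL/gl.h> is already included by the engine */

/* Round up to the next power of two, e.g. 1366 -> 2048. */
static int NextPowerOfTwo (int n)
{
    int p = 1;
    while (p < n)
        p <<= 1;
    return p;
}

/* Allocate the screen texture once; after glCopyTexSubImage2D only the
   width x height corner of it holds the frame.  Sketch only. */
void GL_AllocScreenTexture (int width, int height, GLuint *tex, float *smax, float *tmax)
{
    int texwidth  = NextPowerOfTwo (width);
    int texheight = NextPowerOfTwo (height);

    glGenTextures (1, tex);
    glBindTexture (GL_TEXTURE_2D, *tex);
    glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, texwidth, texheight, 0,
                  GL_RGB, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    /* maximum texcoords for the fullscreen quad */
    *smax = (float) width  / texwidth;
    *tmax = (float) height / texheight;
}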

GL_BGRA became part of core in GL 1.2 right? I'm checking for GL2 for this codepath so I think that's safe.

>affects full screen (so make sure the full hud is redrawn every frame).
yep, I was bitten by that!

Re: dual monitor gamma issues

Posted: Mon Feb 09, 2015 1:28 am
by Spike
quote from elsewhere:
The R300 and R400-based cards (Radeon 9500+ and X500+) are incapable of generic NPOT usage, despite allegedly supporting OpenGL 2.0 (which requires full support). These cards only allow you to use NPOTs if the texture has no mipmaps.
NV30-based cards (GeForce FX of any kind) are incapable of NPOTs at all, despite allegedly supporting OpenGL 2.0 (which again requires NPOTs). It will fall back to software rendering if you try to use them.
Any hardware beyond that can handle NPOTs of any kind perfectly.

GLES also does not support full npot, but most implementations feature a basic npot extension. 'basic npot' in this case is mipless, clamp-to-edge textures. Note that these should also be hardware accelerated on the two cards mentioned above (even if emulated via rect textures).
Keeping an eye on gles limitations can be good as it helps avoid hitting software emulation, plus potentially aids portability later.

It is possible for a card to support GLSL without GL2 because some other required extension is missing. Such cards are *really* old, and that category excludes the pre-gl2 hardware you might actually care about - intel's winxp drivers.

GL_BGRA is part of microsoft's opengl1.1 stack as well as mandatory in gl1.2 as you mentioned, so it is almost universal on desktop gl, but it is in no way guaranteed in any version of GLES. Although as GLES tends to require special care with glTexImage anyway, perhaps you won't have reason to care.
Weird side note: BGRA is not supported by d3d10 while RGBA support is mandatory.

Re: dual monitor gamma issues

Posted: Mon Feb 09, 2015 10:33 am
by mh
GL_ARB_texture_rectangle is another option if GL_ARB_texture_non_power_of_two is not co-operating with you; it's based on GL_NV_texture_rectangle so I'd expect that GeForce FX support should be present (although I haven't tested so I can't confirm).
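
For reference, the rectangle-texture route looks roughly like the sketch below; coordinates are unnormalised (in pixels) rather than 0..1. Names and structure are just illustrative.

/* assumes <GL/gl.h> + glext.h and that GL_ARB_texture_rectangle was found
   in the extension string */
void GL_AllocScreenTextureRect (int width, int height, GLuint *tex)
{
    glGenTextures (1, tex);
    glBindTexture (GL_TEXTURE_RECTANGLE_ARB, *tex);

    /* dimensions need not be powers of two, but coords are in pixels and
       mipmaps / GL_REPEAT are not allowed on this target */
    glTexImage2D (GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB, width, height, 0,
                  GL_RGB, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri (GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri (GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}

/* in GLSL the sampler becomes sampler2DRect (needs
   "#extension GL_ARB_texture_rectangle : enable") and can be read with
   texture2DRect (screentex, gl_FragCoord.xy) - no texcoords needed. */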

Re: dual monitor gamma issues

Posted: Fri Feb 20, 2015 10:56 pm
by metlslime
i'm a total noob in shaders, but is all this necessary? I would think you can just draw a textureless quad over the whole screen, with a fragment shader that takes the destination color, modifies it with the gamma function, and then writes it to the output.

Re: dual monitor gamma issues

Posted: Sat Feb 21, 2015 12:16 am
by Spike
despite direct3d's naming, these shaders are not 'pixel' shaders, they're fragment shaders and determine the colour of the fragment rather than any specific pixel on the screen.
To put that another way (ie: a more obvious way), the result of the fragment shader is fed into a blend unit, and it's the blend unit (fixed-function) which does the actual pixel output. the prior colour of the framebuffer is not known until *after* the fragment shader has run, in part because multiple fragment shaders for the same pixel can be run before the blend unit is actually given any data (in anticipation of depth rejection etc).
Thus if you want to read the screen you must first copy it to a texture (this is not a problem assuming you have either npot or rect textures; gl_FragCoord + rect textures doesn't even need a varying).
If you're feeling lazy, you can use glCopyTexImage and copy the framebuffer into a texture instead of needing to set up an fbo. There's a slight performance hit (but not as much as you might think) and of course you're limited to images no bigger than the framebuffer (though the actual texture can be larger, with only a corner of it sampled).

note that some gles devices DO allow directly reading the framebuffer via the gl_LastFragData array thing (GL_EXT_shader_framebuffer_fetch). However, that is only supported because these devices are typically tile-based. Such an extension on desktop GL would enforce synchronisation, basically stalling the pipeline, which I'm sure you can understand would be disadvantageous.
Also, this won't help with bloom/blur.

Also note that you can write a contrast 'shader' by utilising the blend unit's ability to blend according to the destination value (drawn once or twice), and brightness is a flat addition, so it's really just gamma that needs fancy textures and glsl.
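
For what it's worth, that blend-unit trick usually looks roughly like the sketch below (DrawFullscreenQuad() is an assumed helper that just draws an untextured quad over the screen with the current colour):

/* assumes <GL/gl.h> is available */
void DrawFullscreenQuad (void);   /* assumed engine helper */

/* Contrast via the blend unit: (GL_DST_COLOR, GL_ONE) gives dst*(1+src),
   so drawing with colour (f-1) multiplies the frame by f; repeat for f > 2.
   Brightness is a flat addition via (GL_ONE, GL_ONE).  Sketch only. */
void GL_BrightenScreen (float contrast, float brightness)
{
    glEnable (GL_BLEND);

    if (contrast > 1)
    {
        glBlendFunc (GL_DST_COLOR, GL_ONE);
        while (contrast > 1)
        {
            float f = (contrast > 2) ? 1.0f : contrast - 1.0f;
            glColor3f (f, f, f);
            DrawFullscreenQuad ();
            contrast /= 2;
        }
    }

    if (brightness > 0)
    {
        glBlendFunc (GL_ONE, GL_ONE);
        glColor3f (brightness, brightness, brightness);
        DrawFullscreenQuad ();
    }

    glDisable (GL_BLEND);
    glColor3f (1, 1, 1);
}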

so... no. you can't just read the destination colour and replace it with something better.

Re: dual monitor gamma issues

Posted: Sat Feb 21, 2015 1:22 am
by Baker
metlslime wrote:i'm a total noob in shaders, but is all this necessary? I would think you can just draw a textureless quad over the whole screen, with a fragment shader that takes the destination color, modifies it with the gamma function, and then writes it to the output.
For brightness changes, use the quad method. For gamma changes, apply gamma to the textures and reupload a few of them every frame. :lol:

(How often does someone change gamma levels? They find one they like and then it stays there. If they are changing it, they are probably in the menu. I guess I'm saying that if it takes 20-30 frames for it to propagate, you won't really notice. :lol: )

[Sounds funny, but the concept is decent. Case in point, the beta Mark V 0/99 in the Func thread has a very sophisticated "levels menu" and "demos menu" that can require a lot of time if you have 2000 maps --- it opens each map file to get the title out (and opens demos to get info too). But it appears instant because it does 17 per frame, where 17 is the number of menu items visible at once, so a user would never know it isn't "real time". :D ]

The idea sounds funny, but I hate saved screenshots that don't reflect the brightness and gamma level of the screen, so I already had a function to apply gamma anyway. Haha :D

Re: dual monitor gamma issues

Posted: Sun Feb 22, 2015 9:19 am
by ericw
Baker, that's a cool idea.
As long as you make sure to gamma-correct everything (lightmaps, textures, view-blend colors,..) it should work well.
For cases where there's alpha blending - water, glass, pain flashes - you won't get exactly the same result as doing it on the finished image, right? (0.5 * texture^gamma) + (0.5 * viewblend^gamma) != ((0.5 * texture) + (0.5 * viewblend))^gamma
But I don't imagine it would be noticeable.
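
Just to put a number on that (a toy calculation, values picked arbitrarily):

#include <math.h>
#include <stdio.h>

/* Toy check of the blending-order difference above. */
int main (void)
{
    double tex = 0.25, blend = 1.0, g = 0.5;   /* 'g' is the gamma exponent */

    double gamma_first = 0.5 * pow (tex, g) + 0.5 * pow (blend, g);  /* gamma on inputs */
    double gamma_last  = pow (0.5 * tex + 0.5 * blend, g);           /* gamma on result */

    printf ("%.3f vs %.3f\n", gamma_first, gamma_last);   /* 0.750 vs 0.791 */
    return 0;
}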

Re: dual monitor gamma issues

Posted: Sun Feb 22, 2015 1:51 pm
by Baker
Wouldn't gamma correcting lightmaps and view blends be applying gamma twice?

Hardware gamma is just gamma correcting every pixel on the screen once. Inside a non-leaking Quake map, without noclipping into the void, every pixel of the screen is covered with textures.

My idea isn't particularly original; GLQuake 0.97 or 0.98 added the -gamma parameter, which does this once before texture upload.
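
For anyone curious, that pre-upload correction is essentially a 256-entry lookup table run over the texel bytes before glTexImage2D; a sketch (the exponent convention varies between engines):

#include <math.h>

static unsigned char gammatable[256];

/* Build the lookup once whenever the gamma setting changes. */
void BuildGammaTable (float gamma)
{
    int i;
    for (i = 0; i < 256; i++)
    {
        int v = (int) (255.0 * pow (i / 255.0, (double) gamma) + 0.5);
        if (v > 255) v = 255;
        gammatable[i] = (unsigned char) v;
    }
}

/* Run every byte of an RGBA image through the table before upload.
   Alpha is left alone.  Sketch only. */
void ApplyGammaToTexels (unsigned char *rgba, int width, int height)
{
    int i;
    for (i = 0; i < width * height * 4; i += 4)
    {
        rgba[i + 0] = gammatable[rgba[i + 0]];
        rgba[i + 1] = gammatable[rgba[i + 1]];
        rgba[i + 2] = gammatable[rgba[i + 2]];
    }
}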

Re: dual monitor gamma issues

Posted: Sun Feb 22, 2015 3:01 pm
by Spike
https://www.opengl.org/registry/specs/A ... r_sRGB.txt
https://www.opengl.org/registry/specs/E ... e_sRGB.txt

these make the hardware rescale according to the sRGB curve, which roughly matches a typical hardware gamma ramp. that allows you to treat the textures+framebuffer as though they're linear despite pc hardware not being linear.
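
In practice the two extensions boil down to an sRGB internal format at upload time plus one enable, roughly as sketched below (runtime extension checks omitted; note this gives the fixed sRGB curve rather than a user-adjustable gamma slider):

/* assumes <GL/gl.h> + glext.h and that EXT_texture_sRGB and
   {ARB,EXT}_framebuffer_sRGB were found in the extension string */

void GL_UploadTextureSRGB (int width, int height, const unsigned char *pixels)
{
    /* store the texture as sRGB so the hardware linearises it when sampling */
    glTexImage2D (GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8_EXT, width, height, 0,
                  GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

void GL_BeginLinearRendering (void)
{
    /* linear shader output is re-encoded to sRGB on its way to the framebuffer */
    glEnable (GL_FRAMEBUFFER_SRGB_EXT);
}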

Re: dual monitor gamma issues

Posted: Sun Feb 22, 2015 6:19 pm
by Baker
Typical shitty OpenGL documentation. I read most of it; it looks like a texture attribute, but NFI how it would actually be used, nor do they say.

And due to OpenGL's shitty documentation, the standard way to find info on OpenGL is to use a search engine, ignore any results from the official OpenGL web sites, and see if you can find actual information.

If the iPhone hadn't been invented, creating a marketplace for mobile and demand for OpenGL, or if Microsoft (who produce great documentation) were an actual software company (instead of a company that specializes in poisoning its own well), there would be no OpenGL.

(MSDN has far superior OpenGL documentation than Khronos does.)