GL Gamma Correction
There are a few different methods to correct gamma:
1. With Quake classic textures (256 shades of brown), or even Half-Life's independent palette per texture, you can apply the gamma (and/or contrast) to the color table prior to upload. You can't reasonably do this with a 24-bit image, at least as far as I know, because such a lookup table would be rather large (16 MB) and likely slow/awkward to apply to replacement textures.
2. Hardware gamma. (Affects entire screen).
3. What modern DarkPlaces does with a shader.
Btw ... I have noticed in every engine that uses #2 (all of the OpenGL ones except DarkPlaces, since it uses #3) that if I ALT-TAB out of fullscreen and then back into Quake, the hardware gamma correction resets to system defaults a few seconds later (ATI Radeon Mobility) on both Windows and the Mac.
I'm trying to devise a way to intercept this brightness change, as it irritates me. I've noticed it is associated with ChangeDisplaySettings.
The night is young. How else can I annoy the world before sunrise? Inquisitive minds want to know! And if they don't -- well, like that has ever stopped me before ..
Re: GL Gamma Correction
I added a gamma setting on gaining and losing focus, um, I think in the key events code.
Now the only time the desktop gamma is off is if running in a window with gamma not set to 1.
Re: GL Gamma Correction
I tested gamma and contrast with shaders. It turned out not bad. I'm thinking of removing the hardware gamma ramp.
A simple shader for postprocess:
Code: Select all
#define GammaCorrection(color, gamma) pow(color, 1.0 / gamma)
vec3 ContrastSaturationBrightness(vec3 color, float brt, float sat, float con)
{
// Increase or decrease these values to adjust the r, g and b color channels separately
const float AvgLumR = 0.5;
const float AvgLumG = 0.5;
const float AvgLumB = 0.5;
const vec3 LumCoeff = vec3(0.2125, 0.7154, 0.0721);
vec3 AvgLumin = vec3(AvgLumR, AvgLumG, AvgLumB);
vec3 brtColor = color * brt;
vec3 intensity = vec3(dot(brtColor, LumCoeff));
vec3 satColor = mix(intensity, brtColor, sat);
vec3 conColor = mix(AvgLumin, satColor, con);
return conColor;
}
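For sanity-checking the shader math on the CPU, here's the same function translated to plain C (my sketch; `CSB`, `vec3f` and `mixf` are illustrative names, with `mixf` following GLSL's mix(a, b, t) = a*(1-t) + b*t). With brt = sat = con = 1 it's an identity, and sat = 0 collapses to luminance grey:

```c
typedef struct { float r, g, b; } vec3f;

/* GLSL-style mix: linear blend from a to b by t */
static float mixf (float a, float b, float t) { return a * (1.0f - t) + b * t; }

static vec3f CSB (vec3f c, float brt, float sat, float con)
{
	const float lumR = 0.2125f, lumG = 0.7154f, lumB = 0.0721f;
	vec3f o;
	float lum;
	/* brightness: plain multiply */
	c.r *= brt; c.g *= brt; c.b *= brt;
	/* saturation: blend between the grey luminance and the colour */
	lum = c.r * lumR + c.g * lumG + c.b * lumB;
	o.r = mixf (lum, c.r, sat);
	o.g = mixf (lum, c.g, sat);
	o.b = mixf (lum, c.b, sat);
	/* contrast: blend between mid-grey (the 0.5 AvgLum values) and the colour */
	o.r = mixf (0.5f, o.r, con);
	o.g = mixf (0.5f, o.g, con);
	o.b = mixf (0.5f, o.b, con);
	return o;
}
```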
Re: GL Gamma Correction
Barnes wrote: simple shader for postprocess
Interesting .... Reminds me I need to look around your source code for some of the great things you've done, like with lighting.
Re: GL Gamma Correction
Final postprocess shader (works very well):
Code: Select all
// vertex shader
void main (void) {
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
// fragment shader
uniform sampler2DRect u_ScreenTex;
uniform float u_gamma;
uniform float u_brightnes;
uniform float u_contrast;
uniform float u_saturation;
vec3 BrightnesContrastSaturation(vec3 color, float brt, float con, float sat)
{
// Increase or decrease these values to adjust the r, g and b color channels separately
const float AvgLumR = 0.5;
const float AvgLumG = 0.5;
const float AvgLumB = 0.5;
const vec3 LumCoeff = vec3(0.2125, 0.7154, 0.0721);
vec3 AvgLumin = vec3(AvgLumR, AvgLumG, AvgLumB);
vec3 brtColor = color * brt;
vec3 intensity = vec3(dot(brtColor, LumCoeff));
vec3 satColor = mix(intensity, brtColor, sat);
vec3 conColor = mix(AvgLumin, satColor, con);
return conColor;
}
void main(void)
{
vec4 color = texture2DRect(u_ScreenTex, gl_FragCoord.xy);
color.rgb = BrightnesContrastSaturation(color.rgb, u_brightnes, u_contrast, u_saturation);
gl_FragColor = vec4(pow(color.rgb, vec3(1.0 / u_gamma)), color.a);
}
Re: GL Gamma Correction
Ahh yes!!!
Photoshop math in GLSL and HLSL (I use some of it):
http://mouaif.wordpress.com/2009/01/05/ ... l-shaders/
Re: GL Gamma Correction
It is getting far more likely I mess around with shaders in the near future.
Thanks!
Re: GL Gamma Correction
Any time
Re: GL Gamma Correction
Or just don't use an ATI card, as this is very much an ATI-driver-specific issue.
Still, shader-based gamma works nicely with windows (I don't mean ms windows... damn them and their too-generic names).
Re: GL Gamma Correction
Spike wrote: Or just don't use an ATI card, as this is very much an ATI-driver-specific issue.
Macbook Pro + Compaq Windows laptop.
I have no choice in the matter!!! I should probably allow the natural evolution of drivers to fix this and deal with it.
Re: GL Gamma Correction
Strangely enough, I wrote some code to check the system gamma ramp to see if it changed. The ramp isn't changing, so the ATI drivers are lying. And setting the same gamma ramp again doesn't work because "it didn't change" (or so the API thinks, I imagine).
So what I am doing is setting a timer for 3 seconds after any ChangeDisplaySettings; after 3 seconds I set the gamma twice (once to system, once to user) and clear the timer. Somewhat stupid workaround, but this irritates me because if I press ALT-ENTER to swap my engine from fullscreen to windowed mode or the reverse, I do want the screen brightness to be right.
I'll live with this intermediate solution until I have the time to mess around with shader-based gamma correction.
(I imagine ATI did this because gamers who had a game crash would be stuck with the wrong screen brightness in various games --- so I get why they did it. )
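The timer logic described above can be sketched platform-neutrally (my names throughout; real Windows code would drive this from SetTimer/WM_TIMER rather than polling):

```c
/* One-shot "restore gamma 3 seconds after a mode change" logic, as a
   hypothetical sketch of the workaround described above. */
typedef struct {
	double mode_change_time;  /* when ChangeDisplaySettings last ran */
	int    pending;           /* restore still owed? */
} gamma_restore_t;

static void Gamma_NoteModeChange (gamma_restore_t *g, double now)
{
	g->mode_change_time = now;
	g->pending = 1;
}

/* returns 1 exactly once, when 3 seconds have elapsed; the caller then
   sets the gamma twice (once to system, once to user) */
static int Gamma_ShouldRestore (gamma_restore_t *g, double now)
{
	if (g->pending && now - g->mode_change_time >= 3.0)
	{
		g->pending = 0;
		return 1;
	}
	return 0;
}
```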
Re: GL Gamma Correction
Fix for ATI cards:
Code: Select all
uniform sampler2DRect u_ScreenTex;
uniform float u_gamma;
uniform float u_brightnes;
uniform float u_contrast;
uniform float u_saturation;
vec3 BrightnesContrastSaturation(vec3 color, float brt, float con, float sat)
{
// Increase or decrease these values to adjust the r, g and b color channels separately
const float AvgLumR = 0.5;
const float AvgLumG = 0.5;
const float AvgLumB = 0.5;
const vec3 LumCoeff = vec3(0.2125, 0.7154, 0.0721);
vec3 AvgLumin = vec3(AvgLumR, AvgLumG, AvgLumB);
vec3 brtColor = color * brt;
vec3 intensity = vec3(dot(brtColor, LumCoeff));
vec3 satColor = mix(intensity, brtColor, sat);
vec3 conColor = mix(AvgLumin, satColor, con);
return conColor;
}
void main(void)
{
vec3 color = texture2DRect(u_ScreenTex, gl_FragCoord.xy).rgb;
color = BrightnesContrastSaturation(color, u_brightnes, u_contrast, u_saturation);
gl_FragColor.rgb = pow(color, 1.0 / vec3(u_gamma));
gl_FragColor.a = 1.0;
}
Re: GL Gamma Correction
Are there any gl/dx nq engines except DP that use non-system gamma? I really need something else that is usable in windowed mode w/o affecting the rest of the screen space
Re: GL Gamma Correction
SlapMap wrote: Are there any gl/dx nq engines except DP that use non-system gamma? I really need something else that is usable in windowed mode w/o affecting the rest of the screen space
Unless Spike has something in the SVN for FTEQW, DarkPlaces is the only one.
Unless you want to use the -gamma command line parameter in GLQuake (but I doubt that's what you want).
Re: GL Gamma Correction
Hmm, an interesting non-shader-based gamma function I have from one of mh's older engines.
It uses OpenGL's combine extension, so it will not work with the software renderer.
Code: Select all
/*
===================
SCR_SetBrightness
Enables setting of brightness without having to do any mondo fancy shite.
It's assumed that we're in the 2D view for this...
Basically, what it does is multiply framebuffer colours by an incoming constant between 0 and 2
===================
*/
void SCR_SetBrightness (float brightfactor)
{
// divide by 2 cos the blendfunc will sum src and dst
const GLfloat brightblendcolour[4] = {0, 0, 0, 0.5f * brightfactor};
const GLfloat constantwhite[4] = {1, 1, 1, 1};
// don't trust == with floats; don't bother if it's 1 cos it does nothing to the end result!!!
if (brightfactor > 0.99 && brightfactor < 1.01)
{
return;
}
glColor4fv (constantwhite);
glEnable (GL_BLEND);
glDisable (GL_ALPHA_TEST);
glBlendFunc (GL_DST_COLOR, GL_SRC_COLOR);
// combine hack...
// this is weird cos it uses a texture but actually doesn't - the parameters of the
// combiner function only use the incoming fragment colour and a constant colour...
// you could actually bind any texture you care to mention and get the very same result...
// i've decided not to bind any at all...
glTexEnvf (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvf (GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_MODULATE);
glTexEnvf (GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_CONSTANT);
glTexEnvf (GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_ALPHA);
glTexEnvf (GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);
glTexEnvf (GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
glTexEnvfv (GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, brightblendcolour);
glBegin (GL_QUADS);
glTexCoord2f (0, 0);
glVertex2f (0, 0);
glTexCoord2f (0, 1);
glVertex2f (0, GL_State.OrthoHeight);
glTexCoord2f (1, 1);
glVertex2f (GL_State.OrthoWidth, GL_State.OrthoHeight);
glTexCoord2f (1, 0);
glVertex2f (GL_State.OrthoWidth, 0);
glEnd ();
// restore combiner function colour to white so as not to mess up texture state
glTexEnvfv (GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, constantwhite);
glTexEnvf (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glDisable (GL_BLEND);
glEnable (GL_ALPHA_TEST);
glColor4fv (constantwhite);
}
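A note on the 0.5f in brightblendcolour above: with glBlendFunc(GL_DST_COLOR, GL_SRC_COLOR) the framebuffer result per channel is src*dst + dst*src = 2*src*dst, and the combiner makes src = 0.5 * brightfactor, so the net effect is simply dst * brightfactor. A quick CPU check of that algebra (BlendResult is my name, just modelling the blend equation):

```c
/* models one colour channel through the combiner + blend stages used
   by SCR_SetBrightness above */
static float BlendResult (float dst, float brightfactor)
{
	float src = 0.5f * brightfactor;  /* combiner: constant alpha * white */
	return src * dst + dst * src;     /* GL_DST_COLOR + GL_SRC_COLOR blend */
}
```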
Productivity is a state of mind.