Colored lighting faithful to the original colormap?
The main reason I never implemented colored lighting is that I don't want the engine to lose this:
See how the mono lighting makes the walls go from beige to purple, to blue, and to green before going fully black?
It would be easy to blend that lighting with a single colored light. But when multiple colored lights of different colors intersect each other (e.g. a blue light from the quad and a red light from a rocket), and also intersect a mono light (usually from the BSP's lightmap), I couldn't think of a good way to preserve the color artifacts of the colormap while gradually blending it with the multiple colors of the colored lights.
Any ideas?
Re: Colored lighting faithful to the original colormap?
if you're staying 8-bit, the palette is too useless to do coloured lighting (though you might be able to get away with it with dithering, but that's ugly in its own right).
if you want to retain the discolouration, you could probably achieve that by applying some non-linear distortion of the input lightmaps. increase green values at the low end and boost red at the highish end.
or just ask yourself if discoloured banding is really what you like seeing...
if you're doing direct-to-rgb rendering (ie: paletted texture, rgb framebuffer), just do the colormap[lightmap][texture] lookup 3 times per pixel? you'll get the same result for white lights while low green lights will be brighter than low red lights. plus you'll still have the hideous banding thing (although yes, you may get 3 times as much banding as each light channel starts to band at a different point, but as making it ugly appears to be your goal this is somewhat of a moot point and may actually help image quality much like dithering does)...
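The triple-lookup suggestion could be sketched like this. The table layouts are modeled on Quake's software renderer (64 light levels by 256 palette indices), but the toy tables built here, a grayscale palette and a linear colormap whose last row is brightest, are stand-ins for the real data:

```c
#include <assert.h>
#include <stdint.h>

enum { LEVELS = 64, COLORS = 256 };

static uint8_t colormap[LEVELS][COLORS]; /* [level][index] -> palette index */
static uint8_t palette[COLORS][3];       /* [index] -> 8-bit RGB */
static int tables_ready;

/* Toy stand-ins: grayscale palette, colormap darkening linearly
 * (row LEVELS-1 brightest here, unlike real Quake's row 0). */
static void toy_tables(void)
{
    for (int i = 0; i < COLORS; i++)
        palette[i][0] = palette[i][1] = palette[i][2] = (uint8_t)i;
    for (int l = 0; l < LEVELS; l++)
        for (int i = 0; i < COLORS; i++)
            colormap[l][i] = (uint8_t)(i * l / (LEVELS - 1));
    tables_ready = 1;
}

/* One colormap walk per channel: each channel of the colored light
 * picks its own shaded palette index, and only that channel of the
 * resulting color is kept.  White light (r == g == b) reproduces the
 * mono result exactly; each channel bands at its own point. */
static void shade_rgb(uint8_t texel,
                      uint8_t lr, uint8_t lg, uint8_t lb,
                      uint8_t out[3])
{
    if (!tables_ready) toy_tables();
    out[0] = palette[colormap[lr >> 2][texel]][0]; /* 0..255 -> 0..63 */
    out[1] = palette[colormap[lg >> 2][texel]][1];
    out[2] = palette[colormap[lb >> 2][texel]][2];
}
```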
Re: Colored lighting faithful to the original colormap?
Spike wrote: "making it ugly appears to be your goal"

I wouldn't call it "ugly", but "gritty". With dithering, it actually looks really good to me:
Losing this hue banding makes Quake look too bland, too clean. The hue banding gives more personality to the game, making the darker places look more dirty. Losing this "dirty darkness" look is a loss of visual detail, which is unacceptable to me.
Spike wrote: "if you're doing direct-to-rgb rendering (ie: paletted texture, rgb framebuffer)"

Yes, that's how I'd implement colored lighting.
Spike wrote: "you may get 3 times as much banding as each light channel starts to band at a different point, but [...] may actually help image quality much like dithering does"

Hmm, that's something I'd have to see in action to know if it helps without making the lighting more bland. Gotta install some programming tools on this new laptop...
Spike wrote: "low green lights will be brighter than low red lights"

That's something I'd like to avoid. Colored lighting shouldn't have banding. Plus, "GL" players are used to uniformly-shaded colored lighting.
So, I'd like to have the mono lighting with the hue banding of the colormap, while simultaneously having the colored lighting with no banding. Figuring out how a blend of them should look is part of the challenge. The other part is to implement this in a fast way, so the mono lighting doesn't get slower to render and the colored lighting doesn't make it much slower.
Re: Colored lighting faithful to the original colormap?
Probably a good way to achieve what I'm after is to blend all the lights, calculate the saturation of the result, derive a hue banding inversely proportional to that saturation (scaled to the hue banding of the colormap), and apply the resulting banding to the hue of the lighting.
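A minimal sketch of that blend, assuming additive light mixing, a max/min saturation measure, and a linear banding ramp (all helper names here are made up):

```c
#include <assert.h>

typedef struct { int r, g, b; } Light;

static int clamp255(int v) { return v < 0 ? 0 : v > 255 ? 255 : v; }

/* Sum all lights additively, clamped to 8 bits per channel. */
static Light blend_lights(const Light *lights, int n)
{
    Light sum = {0, 0, 0};
    for (int i = 0; i < n; i++) {
        sum.r = clamp255(sum.r + lights[i].r);
        sum.g = clamp255(sum.g + lights[i].g);
        sum.b = clamp255(sum.b + lights[i].b);
    }
    return sum;
}

/* Saturation in 0..255: 0 for gray (mono) light, 255 for a pure hue. */
static int saturation(Light l)
{
    int max = l.r > l.g ? l.r : l.g; if (l.b > max) max = l.b;
    int min = l.r < l.g ? l.r : l.g; if (l.b < min) min = l.b;
    return max == 0 ? 0 : (max - min) * 255 / max;
}

/* Quantization step shrinks from the colormap's step (256 / 64 = 4)
 * at zero saturation down to 1 (no banding) at full saturation, so
 * pure mono light keeps colormap-style banding and strongly colored
 * light stays smooth. */
static int band_channel(int value, int sat)
{
    int step = 1 + ((255 - sat) * 3) / 255;
    return (value / step) * step;
}
```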
Re: Colored lighting faithful to the original colormap?
This is why I chose to use a big 18-bit translation table (generated using the same BestColor function that made the colormap). Remember that 8-bit color (in VGA) only has 18-bit color precision; anything more than that is a waste. You'd probably have to do something about BuildLightmapRGB halving precision. Note that Engoo shifts the lightmap a tiny bit brighter to avoid artifacts, so that might have also affected the banding.
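A sketch of such a translation table: 6 bits per channel (VGA DAC precision) indexes 262,144 entries. The best_color() below is a plain squared-distance nearest match, which is only an assumption about what the original BestColor does, and the grayscale palette in toy_init() is for demonstration only:

```c
#include <assert.h>
#include <stdint.h>

static uint8_t palette[256][3];
static uint8_t rgb18_to_index[1 << 18];  /* 256 KB: 64*64*64 entries */

/* Assumed BestColor-style search: nearest palette entry by squared
 * RGB distance. */
static int best_color(int r, int g, int b)
{
    int best = 0;
    long bestdist = 1L << 30;
    for (int i = 0; i < 256; i++) {
        long dr = r - palette[i][0];
        long dg = g - palette[i][1];
        long db = b - palette[i][2];
        long dist = dr * dr + dg * dg + db * db;
        if (dist < bestdist) { bestdist = dist; best = i; }
    }
    return best;
}

/* Precompute every 18-bit color once, so per-pixel work is one read. */
static void build_table(void)
{
    for (int r = 0; r < 64; r++)
        for (int g = 0; g < 64; g++)
            for (int b = 0; b < 64; b++)
                rgb18_to_index[(r << 12) | (g << 6) | b] =
                    (uint8_t)best_color(r << 2, g << 2, b << 2);
}

/* Lookup: drop each 8-bit channel to VGA's 6 significant bits. */
static int lookup_rgb(int r, int g, int b)
{
    return rgb18_to_index[((r >> 2) << 12) | ((g >> 2) << 6) | (b >> 2)];
}

/* Grayscale toy palette for a quick sanity check. */
static void toy_init(void)
{
    for (int i = 0; i < 256; i++)
        palette[i][0] = palette[i][1] = palette[i][2] = (uint8_t)i;
    build_table();
}
```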
i should not be here
Re: Colored lighting faithful to the original colormap?
leileilol wrote: "This is why I chose to use a big 18-bit translation table. Remember that 8-bit color (in VGA) only has 18-bit color precision, anything more than that is a waste."

I never understood this 18-bit thing, until I read this:

"the VGA only gives us 6 bits per color channel, so the best you can get is 18-bit color (but you can only pick 256 of those colors, of course). [...] the program needs to change the palette data from 24-bit to 18-bit (divide each color by four, or right-shift by two)."

I don't care about VGA hardware limitations. The palette itself is in 24-bit (8 bits per color channel), and I don't plan to run the game on any device limited to 256 colors.
Also, I don't know if any of the colors in Quake's default palette has any of the two rightmost bits set to 1, but the engine should be able to handle any custom palette that uses them.
Re: Colored lighting faithful to the original colormap?
I think what was meant by the VGA compatibility is essentially DOSBox compatibility, which should be VGA as far as I know (were there ever 16-bit color DOS games in the late 90s?).
(And it has the nicety of being able to run on any platform that supports DOSBox, which is almost everything.)
The night is young. How else can I annoy the world before sunrise? Inquisitive minds want to know! And if they don't -- well, like that ever has stopped me before ..
Re: Colored lighting faithful to the original colormap?
mankrip wrote: "Probably a good way to achieve what I'm after is to blend all the lights, calculate the intensity of the resulting saturation, calculate the hue banding inversely proportional to the intensity of the saturation and scaled to the hue banding of the colormap, and apply the resulting banding to the hue of the lighting."

An additive lighting mix table with saturation scaling already built in might be able to handle it. Light blending is additive, so it's not necessary to wait until the final result. In color lookup (bestcolor), banding increases as 'value' becomes more important than 'hue'.
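In its simplest form, such a mix table might look like this. Here the entries are just clamped sums; the saturation scaling mentioned above would be folded into the entries when the table is built:

```c
#include <assert.h>
#include <stdint.h>

/* addmap[a][b] replaces an add-and-clamp per pixel with one lookup.
 * Any per-entry scaling (e.g. saturation scaling) can be baked in
 * here at build time for free. */
static uint8_t addmap[256][256];

static void build_addmap(void)
{
    for (int a = 0; a < 256; a++)
        for (int b = 0; b < 256; b++) {
            int s = a + b;
            addmap[a][b] = (uint8_t)(s > 255 ? 255 : s);
        }
}
```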
Re: Colored lighting faithful to the original colormap?
Depends how late in the 1990s; cards like the S3 ViRGE and the Tseng Labs ET4000 did have more than 16-bit color
(actually, they could both handle 32-bit color, but they were so slow that they couldn't run a game at that color setting).
There were also VESA cards able to run games at 256-color depth.
Have a look here: http://www.maximumpc.com/article/featur ... d_graphics
Productivity is a state of mind.
Re: Colored lighting faithful to the original colormap?
mankrip wrote: "With dithering, it actually looks really good to me"

AGREED.

mankrip wrote: "Losing this hue banding makes Quake look too bland, too clean. The hue banding gives more personality to the game, making the darker places look more dirty. Losing this 'dirty darkness' look is a loss of visual detail, which is unacceptable to me."

Actually IMHO that banding, even in glQuake, looks butt ugly. I leave the 'gritty' look to gl_texturemode...
Re: Colored lighting faithful to the original colormap?
Probably could recreate it in GLQuake by linearly resampling the lightmap upward 16x, nearest filtering, AND palettizing it to the Quake palette.
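One way that resample-then-palettize step might be sketched, in one dimension only; the 64-level quantization at the end is a stand-in for a real nearest-color search against the Quake palette:

```c
#include <assert.h>
#include <stdint.h>

enum { SCALE = 16 };

/* Linearly upsample one lightmap row of n samples into
 * (n - 1) * SCALE + 1 samples. */
static void upsample_row(const uint8_t *src, int n, uint8_t *dst)
{
    for (int i = 0; i < (n - 1) * SCALE + 1; i++) {
        int s = i / SCALE, f = i % SCALE;
        if (s >= n - 1) { dst[i] = src[n - 1]; continue; }
        dst[i] = (uint8_t)((src[s] * (SCALE - f) + src[s + 1] * f) / SCALE);
    }
}

/* Snap to 64 levels, like the colormap's granularity: this is what
 * reintroduces the visible banding after the smooth upsample. */
static uint8_t palettize(uint8_t v) { return (uint8_t)(v & ~3); }
```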
i should not be here
Re: Colored lighting faithful to the original colormap?
mankrip wrote: "Probably a good way to achieve what I'm after is to blend all the lights, calculate the intensity of the resulting saturation, calculate the hue banding inversely proportional to the intensity of the saturation and scaled to the hue banding of the colormap, and apply the resulting banding to the hue of the lighting."

Hmm, that wouldn't work for 24-bit textures.
A way to hack it in would be to generate 8-bit copies of all direct-color textures, and then use those to get the indexed colors used to calculate the hue banding.
A simpler, faster and more "natural" way, though probably less accurate, would be to generate a universal hue banding table by averaging the hue banding of all indexed colors at each shading level of the colormap.
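That averaging could be sketched roughly like this. The "ideal linear darkening" baseline and the assumption that colormap row 0 is brightest are my own; a real version would work against the actual Quake colormap and palette:

```c
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>

/* For each of the colormap's 64 shading levels, average the difference
 * between the colormap's actual output color and an ideal linear
 * darkening of the original color, over all 256 palette entries.
 * The result is one RGB bias per level: a crude "universal hue
 * banding" table. */
static void build_bias(uint8_t cm[64][256], uint8_t pal[256][3],
                       int bias[64][3])
{
    for (int l = 0; l < 64; l++) {
        long sum[3] = {0, 0, 0};
        for (int i = 0; i < 256; i++) {
            const uint8_t *shaded = pal[cm[l][i]];
            for (int c = 0; c < 3; c++) {
                int linear = pal[i][c] * (63 - l) / 63; /* row 0 brightest */
                sum[c] += shaded[c] - linear;
            }
        }
        for (int c = 0; c < 3; c++)
            bias[l][c] = (int)(sum[c] / 256);
    }
}

/* Toy check: a colormap that darkens linearly over a grayscale palette
 * should yield zero bias at every level. */
static int toy_max_bias(void)
{
    static uint8_t cm[64][256], pal[256][3];
    int bias[64][3], worst = 0;
    for (int i = 0; i < 256; i++)
        pal[i][0] = pal[i][1] = pal[i][2] = (uint8_t)i;
    for (int l = 0; l < 64; l++)
        for (int i = 0; i < 256; i++)
            cm[l][i] = (uint8_t)(i * (63 - l) / 63);
    build_bias(cm, pal, bias);
    for (int l = 0; l < 64; l++)
        for (int c = 0; c < 3; c++)
            if (abs(bias[l][c]) > worst)
                worst = abs(bias[l][c]);
    return worst;
}
```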