Thoughts on Stencil not working on NVidia


Post by Baker »

I have a procedure that draws the sky by doing the following:
1) Draw the sky brushes with no color (colormask 0) but writing depth (stencil op: keep on z-fail, set stencil to 1 where drawn)
2) Then draw the normal sky or the sky box where stencil == 1, honoring the depth buffer.

Works fine on ATI and Intel. Fails on NVidia.

I expect to have to solve this myself, and I should be able to get my hands on a machine with an NVidia card for direct testing.

But I wanted to ask around to see if I am missing something obvious here ...

Code:

void Sky_Stencil_Draw (void)
{
	int			i;
	msurface_t	*s;
	texture_t	*t;

// Baker: Direct3D doesn't have stencil at this time
	if (!renderer.gl_stencilbits || vid.direct3d)
		return;

// Baker: No sky to draw
	if (!level.sky /*|| !frame.has_sky*/)
		return;

// Baker: Where drawn (doesn't z-fail), replace with 1
// in the stencil buffer.
	eglStencilFunc (GL_ALWAYS, 1, ~0 );
	eglStencilOp (GL_KEEP, GL_KEEP, GL_REPLACE);
	eglEnable (GL_STENCIL_TEST);

// Baker: A no draw pass of the sky brushes, does not even
// write to depth buffer (maybe it should?) that writes
// our stencil overlay.

	eglColorMask (0,0,0,0);
//	eglDepthMask (0);  // Don't write depth to buffer
	eglDisable (GL_TEXTURE_2D);
	for (i=0 ; i<cl.worldmodel->numtextures ; i++)
	{
		t = cl.worldmodel->textures[i];

		if (!t || !t->texturechain || !(t->texturechain->flags & SURF_DRAWSKY))
			continue;

		for (s = t->texturechain; s; s = s->texturechain)
//			if (!s->culled)
			{
				DrawGLPoly (s->polys, 0); // Not here.
				rs_brushpasses++;
			}
	}
	eglEnable (GL_TEXTURE_2D);
	eglColorMask (1,1,1,1);
//	eglDepthMask (1);

// Baker: Keep any pixels where stencil wasn't drawn
// for this drawing pass.
	eglStencilOp( GL_KEEP, GL_KEEP, GL_KEEP );
	eglStencilFunc( GL_EQUAL, 1, ~0 );

// Baker: Now draw the stencil
	Sky_DrawSky ();

// Turn it off
	eglDisable (GL_STENCIL_TEST);
	eglClear (GL_STENCIL_BUFFER_BIT);
}

Re: Thoughts on Stencil not working on NVidia

Post by Spike »

if glGetIntegerv(GL_STENCIL_BITS) is returning 0, then you messed up your context creation.
commonly your depth+stencil buffers are a single 24.8 buffer.
of course, gl treats them separately, which results in some fun...
some hardware actually supports 32-bit depth buffers!...
but of course, with a combined depth+stencil buffer, that then means you have a 0-bit stencil buffer.

so make sure you're actually requesting a stencil buffer at context creation.

that's my guess.
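
For illustration, here is a minimal sketch of what requesting a stencil buffer at context creation can look like, written against SDL2 for self-containment (Mark V's actual windowing code may differ), together with the GL_STENCIL_BITS check:

Code:

#include <SDL.h>
#include <SDL_opengl.h>
#include <stdio.h>

/* Minimal sketch: ask for 24-bit depth + 8-bit stencil, then verify what the
   driver actually granted. Illustrative only; not Mark V code. */
int main (int argc, char **argv)
{
	SDL_Window		*window;
	SDL_GLContext	context;
	GLint			depthbits = 0, stencilbits = 0;

	SDL_Init (SDL_INIT_VIDEO);

	/* 24/8 is the combination most likely to exist everywhere; asking for a
	   32-bit depth buffer can leave you with 0 stencil bits on some drivers. */
	SDL_GL_SetAttribute (SDL_GL_DEPTH_SIZE, 24);
	SDL_GL_SetAttribute (SDL_GL_STENCIL_SIZE, 8);

	window = SDL_CreateWindow ("stencil check", SDL_WINDOWPOS_CENTERED,
		SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL);
	context = SDL_GL_CreateContext (window);

	/* The driver is allowed to hand back something different; check it. */
	glGetIntegerv (GL_DEPTH_BITS, &depthbits);
	glGetIntegerv (GL_STENCIL_BITS, &stencilbits);
	printf ("depth bits: %d, stencil bits: %d\n", depthbits, stencilbits);

	SDL_GL_DeleteContext (context);
	SDL_DestroyWindow (window);
	SDL_Quit ();
	return 0;
}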

Re: Thoughts on Stencil not working on NVidia

Post by Baker »

Spike wrote: if glGetIntegerv(GL_STENCIL_BITS) is returning 0, then you messed up your context creation. [...] so make sure you're actually requesting a stencil buffer at context creation.
Seems like a solid lead. Supposedly my code path draws the sky the "old way" if stencilbits aren't present, and I am requesting an 8-bit stencil buffer.

But I don't recall anything about testing this (seems like something I would have had to test thoroughly, but I have no memory of it -- which isn't unusual.)

I'll keep an eye out for what kind of depth buffer I receive too. I see I am requesting a 32-bit depth buffer ... hmmm ... I could be hitting a 32 + 0 scenario then :(
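
To make the fallback concrete, here is a hypothetical sketch of that startup check; renderer.gl_stencilbits follows the naming in the posted code, while the function name and everything else is invented for illustration (this is not Mark V's actual code):

Code:

/* Hypothetical sketch: record how many stencil bits the context really has so
   the sky code can fall back to the "old way" when there are none. */
void VID_Check_StencilBits (void)
{
	GLint	depthbits = 0, stencilbits = 0;

	glGetIntegerv (GL_DEPTH_BITS, &depthbits);
	glGetIntegerv (GL_STENCIL_BITS, &stencilbits);

	renderer.gl_stencilbits = stencilbits;

	// The "32 + 0" scenario: a 32-bit depth buffer granted with no stencil.
	// Sky_Stencil_Draw above already returns early when gl_stencilbits is 0,
	// so the sky then gets drawn the old, non-stencil way.
	if (depthbits == 32 && stencilbits == 0)
		Con_Printf ("32-bit depth with 0 stencil bits; stencil sky disabled\n");
}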

Re: Thoughts on Stencil not working on NVidia

Post by mh »

Modern hardware should be able to support 32-bit depth with stencil in its own buffer, so it would probably also depend on which NV it was failing on.

In truth 24/8 is the format that's most likely to work on everything, so it's the safest default.

Another way of drawing sky is, IIRC, to draw all the sky polys with colormask 0 and depthmask 1, then draw a sky box or sphere with z-fail, depthrange 1,1 and depthmask 0, then draw everything else, which will ensure that sky ends up only where it should be. Z-fail is not particularly hardware-friendly, but this way doesn't require stencil at all.
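
Roughly, that sequence could look like the sketch below, written against legacy fixed-function GL; the two Sky_* helpers are placeholders for illustration, not functions from Mark V or any particular engine:

Code:

/* Sketch of the stencil-free approach: the sky polys lay down depth only, then
   the sky box is drawn at the far plane where the normal depth test would
   fail, i.e. only over pixels the sky polys touched. Illustrative only. */
void Sky_Depth_Draw (void)
{
// 1) Depth-only pass over the sky brush polys (colormask 0, depthmask 1).
	glColorMask (GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
	glDepthMask (GL_TRUE);
	Sky_DrawSkySurfaces ();		// placeholder: draw the sky brush polys
	glColorMask (GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

// 2) Sky box or sphere pushed to the far plane (depthrange 1,1), drawn only
//    where the incoming depth of 1.0 is greater than what step 1 wrote
//    (z-fail), and without writing depth (depthmask 0).
	glDepthRange (1, 1);
	glDepthFunc (GL_GREATER);
	glDepthMask (GL_FALSE);
	Sky_DrawSkyboxOrSphere ();	// placeholder: draw the box or sphere

// 3) Restore state; the caller then draws everything else normally, which
//    covers any sky overdraw and leaves sky only where it should be.
	glDepthMask (GL_TRUE);
	glDepthFunc (GL_LEQUAL);
	glDepthRange (0, 1);
}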

The best way is of course to just draw the sky polys in-place with a shader.

Re: Thoughts on Stencil not working on NVidia

Post by Baker »

I have the ability to draw mirrors, so I can't control when in the rendering the sky will be drawn. At least not for the mirror.

And for fairness purposes, since I can't control when I draw the sky for the mirror, I made it so the sky gets drawn a bit later and should Z-fail, to catch problems like this up front instead of having rendering issues that only crop up when mirrors are visible. (I mean, what uses mirrors? Nothing currently, hehe. :D So if I want the feature to work, I want the general non-mirror usage to fail the same way, so the mirrors feature can be trusted.)

There is a chance that I'm not z-failing the sky on NVidia somehow.

Because in the bugged screenshots, the sky is stomping foreground brushes/surfaces.

I will find out what is doing this. Perhaps I can solve it in the next 2 days. I mostly need to get the hardware in hand, because every Windows and Mac machine I have is either ATI or Intel.

Re: Thoughts on Stencil not working on NVidia

Post by ericw »

I have two laptops with NVidia (9400m, 650GT) I can offer for testing if you need it.

With the Dec 24 OSX build, sky looks fine, but if I set r_oldwater to 0, liquids are rendered solid white. Not sure if that is related or a different issue.

Re: Thoughts on Stencil not working on NVidia

Post by Baker »

ericw wrote: I have two laptops with NVidia (9400m, 650GT) I can offer for testing if you need it.
Yeah, I'd appreciate that. I should kill off the r_oldwater issue by Monday. I need to look and see who was experiencing the sky drawing issue, because if you aren't seeing it, I'm not sure I know of a way to get my hands on the very specific NVidia hardware that has the problem.

Re: Thoughts on Stencil not working on NVidia

Post by ericw »

This was solved, right? Was it due to not clearing the stencil buffer at the start of sky drawing, or interference from the shadow code?

Re: Thoughts on Stencil not working on NVidia

Post by Baker »

ericw wrote: This was solved, right? Was it due to not clearing the stencil buffer at the start of sky drawing, or interference from the shadow code?
The stencil shadow code cleared the buffer and used the stencil in a way contrary to the sky drawing.

Spike kindly fixed the problem.

Both the stencil sky and the stencil shadow code work really great independently, but I did them at separate times (2 years apart), and I almost never use shadows, nor did I think of testing with shadows. (It was possibly made harder to notice by gl_clear set to 1, which hid the problem from me by clearing the stencil buffer.)

Any Mark V build from the last month or two has this corrected in the source.
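
For anyone hitting the same clash, here is a hypothetical illustration of the general hazard (not Spike's actual fix): two passes sharing the one stencil buffer each need to start from a known state and test only the values they own.

Code:

/* Hypothetical sketch: keep an independent stencil user from inheriting
   whatever an earlier pass (e.g. stencil shadows) left in the buffer.
   Not the actual Mark V fix. */
void Sky_Stencil_Begin (void)
{
// Don't assume the stencil buffer is still clear; a shadow pass (or anything
// else) may have written to it earlier in the frame.
	glClearStencil (0);
	glClear (GL_STENCIL_BUFFER_BIT);

// Now the mask pass can rely on 0 meaning "untouched" and write 1 where the
// sky brushes are drawn, exactly as in the code at the top of the thread.
	glStencilFunc (GL_ALWAYS, 1, ~0);
	glStencilOp (GL_KEEP, GL_KEEP, GL_REPLACE);
	glEnable (GL_STENCIL_TEST);
}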

Re: Thoughts on Stencil not working on NVidia

Post by ericw »

Cool, I borrowed the stencil shadow code from one of the recent MarkV source releases and added it to Quakespasm. I did see this line and took that as well:

Code:

eglStencilFunc(GL_EQUAL, 0, ~0); // Stencil Sky Fix - 2015 April 13 Per Spike
Nice having solid-looking shadows without the ugly self-intersecting bits for the occasional mod that enables them by default (e.g. Warpspasm) :)