
strange thing with vector macros

Posted: Thu Nov 10, 2011 4:50 pm
by revelator
Something I noticed while fiddling with vertex lights.

With the original glColor4f (color[0] * l, color[1] * l, color[2] * l, alpha); things work as expected.

With VectorScale (color, l, color); glColor4f (color[0], color[1], color[2], alpha); which should give the same result, the lighting no longer changes with the angles??? Is this a bug?
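
For reference, in most Quake-derived engines VectorScale is (roughly) this mathlib.h macro, which writes straight into its third argument:

#define VectorScale(in,scale,out) ((out)[0]=(in)[0]*(scale),(out)[1]=(in)[1]*(scale),(out)[2]=(in)[2]*(scale))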

Re: strange thing with vector macros

Posted: Thu Nov 10, 2011 5:27 pm
by Spike
Depends on whether color is an int[3] or a vec3_t/float[3].

Re: strange thing with vector macros

Posted: Thu Nov 10, 2011 9:54 pm
by revelator
float[3], same as the output, so I'm a little stumped as to why this isn't working correctly :/

Re: strange thing with vector macros

Posted: Fri Nov 11, 2011 3:14 am
by r00k
Is VectorScale any faster? Why not just go with what works?

Re: strange thing with vector macros

Posted: Fri Nov 11, 2011 3:35 am
by revelator
It might not be any faster, but I was planning to use it in several places where things could be made simpler. With these results, though, I'd be crazy to do it :mrgreen:

Strange nevertheless.

Re: strange thing with vector macros

Posted: Fri Nov 11, 2011 3:45 am
by andrewj
You've mucked something else up, probably because the 'color' variable got modified by the VectorScale() call.
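
To illustrate (a sketch, assuming 'color' refers to something that persists between calls, such as a global or cached light colour, rather than a per-call temporary):

VectorScale (color, l, color); // this frame: color becomes color * l
VectorScale (color, l, color); // next frame: color becomes color * l * l, and so on

The original glColor4f (color[0] * l, ...) version never writes back into color, which is why it behaves.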

Re: strange thing with vector macros

Posted: Fri Nov 11, 2011 6:22 am
by revelator
Probably, but it's weird. See, I can do the same routine the VectorScale macro uses, and it works if I restrict it to the function where I want to use it, but if I use the global macro it breaks.
I have a strange hunch that some other function using the VectorScale macro is bleeding data and corrupting the output.

Maybe if I write the output to a new vec3_t initialized to 0 it will work? Worth a try.

Something like:
vec3_t rgb = {0, 0, 0};
and probably also the light scale:
float l = 0;

l = vertexlightfunc (yadda, yadda, yadda); // nope, my code does not look like this :) example use only

VectorScale (color, l, rgb); glColor4f (rgb[0], rgb[1], rgb[2], alpha);

Re: strange thing with vector macros

Posted: Sun Nov 13, 2011 3:39 pm
by revelator
Hmm, writing the output to a fresh zero-initialized vector fixed it, so it seems something was indeed modifying the values.
Thanks for the help though :)

Re: strange thing with vector macros

Posted: Mon Nov 14, 2011 12:03 pm
by mh
You should actually be able to do this without needing a VectorScale at all, and even in the fixed pipeline too. Look at GL_ARB_texture_env_combine - what you'll want is to set shadelight as the environment constant colour, the shadedots lookup as the per-vertex colour, and set up the combiners to evaluate GL_CONSTANT_ARB * GL_PRIMARY_COLOR_ARB * GL_TEXTURE. That would remove the need to do some per-vertex calculations on the CPU so it should run faster, although whether or not it's measurably faster depends on other factors in your code. But all the same, getting things off the CPU and onto the GPU where they belong is the right thing IMO.
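
A rough sketch of one way to set that up, exemplary only: it assumes shadelight is a float[3], the shadedots lookup has already been submitted as the per-vertex glColor, and it cascades two texture units because a GL_MODULATE combiner stage only takes two sources (the skintexture name is just a stand-in for whatever texture object you have bound):

GLfloat envcolor[4] = {shadelight[0], shadelight[1], shadelight[2], 1.0f};

// unit 0: GL_TEXTURE * GL_PRIMARY_COLOR (skin * shadedots)
// operands are left at their GL_SRC_COLOR defaults
glActiveTextureARB (GL_TEXTURE0_ARB);
glTexEnvi (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi (GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE);
glTexEnvi (GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE);
glTexEnvi (GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_PRIMARY_COLOR_ARB);

// unit 1: previous result * GL_CONSTANT (shadelight); the unit needs a
// valid texture enabled for the stage to run, but it isn't sampled here
glActiveTextureARB (GL_TEXTURE1_ARB);
glEnable (GL_TEXTURE_2D);
glBindTexture (GL_TEXTURE_2D, skintexture); // any valid texture object
glTexEnvi (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvfv (GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, envcolor);
glTexEnvi (GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE);
glTexEnvi (GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB);
glTexEnvi (GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_CONSTANT_ARB);
glTexEnvi (GL_TEXTURE_ENV, GL_COMBINE_ALPHA_ARB, GL_REPLACE); // pass alpha through
glTexEnvi (GL_TEXTURE_ENV, GL_SOURCE0_ALPHA_ARB, GL_PREVIOUS_ARB);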

This should work on any graphics hardware from a TNT2 upwards. Shaders would of course be better yet again.

Re: strange thing with vector macros

Posted: Mon Nov 14, 2011 12:56 pm
by revelator
Sounds like an interesting way of doing it, I'll give it a try :) thanks.

It might even be possible to do hardware point lighting using combiners, but that might be a tad slow (idea from looking at some older Tenebrae code).

Re: strange thing with vector macros

Posted: Mon Nov 14, 2011 1:39 pm
by mh
You can do attenuation maps using combiners easily enough, but it's not really going to be productive. Overdraw stacks up very quickly with traditionally lit Quake maps - 50x or more in some places in e1m1 for example. It can seem kinda neat to push the limits of the fixed pipeline in this way, but in reality you're coding to hardware that nobody really has anymore, and even if you do identify someone who has such old hardware, this kind of thing is going to be so slow on it anyway that you may as well have not bothered.

Re: strange thing with vector macros

Posted: Mon Nov 14, 2011 2:30 pm
by revelator
I suspected as much. Hmm, I really need to read some more on GLSL (if I ever get time); I'm working for the government now :shock:
Btw, for a small laugh: for once NVIDIA was not to blame for driver crashes; it turned out to be Flash in combination with Firefox that caused the black screen of death cases :lol:
I suspect we're going to see a lot of those edge cases in the future as things get more complicated :)