I did exactly this (brush models with the world) in QF. While in theory sub-models can be instanced, I have yet to encounter an example, and I suspect it would break a vanilla engine; QF assumes sub-models are not instanced. However, even instanced models are drawn together with the world (this was not easy, but I feel it was worth the effort).
The original reason I made QF draw brush models with the world was so I could get fog working in single texture mode without a gob of duplicated code. One big benefit of doing so is better batching of lightmap updates (and we know how important that can be).
QF's glsl renderer throws all brush model vertices into one giant VBO (QF's gl (fixed-function) renderer doesn't use VBOs). This works out fine thanks to glDrawElements and bsp's MAX_MAP_VERTS limit of 65536. It could become a problem for very big maps, but as there are very few instanced bsp model types, and they're all boxes, the map would have to come within a few dozen verts of 65536: pretty rare for now, I believe. Worst comes to worst, update the vertex array offsets (though I wonder how expensive that would get).
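To make the offset bookkeeping concrete, here's a minimal sketch (not QF's actual code; the `model_verts_t` record and `pack_model_verts` name are hypothetical) of assigning each model a base vertex offset in one shared buffer, with a check against the 16-bit index space that GL_UNSIGNED_SHORT glDrawElements implies:

```c
#include <stddef.h>
#include <stdint.h>

#define MAX_MAP_VERTS 65536

/* Hypothetical per-model record: how many vertices the model
   contributes and where they start in the shared VBO. */
typedef struct {
    uint32_t numverts;
    uint32_t base;      /* first vertex index in the shared VBO */
} model_verts_t;

/* Assign each model a base offset in one shared vertex buffer.
   Returns the total vertex count, or 0 if the 16-bit index space
   (glDrawElements with GL_UNSIGNED_SHORT) would overflow. */
static uint32_t
pack_model_verts (model_verts_t *models, size_t count)
{
    uint32_t total = 0;
    for (size_t i = 0; i < count; i++) {
        models[i].base = total;
        total += models[i].numverts;
    }
    return total <= MAX_MAP_VERTS ? total : 0;
}
```

Each surface's element indices then just get its model's base added once at load time, so a single index buffer serves the world and every sub-model.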
OK, how to handle moving bsp models (instanced or non-instanced)...
The first thing I did when developing this code (for gl; I hadn't started glsl yet) was to stop transforming entities in the renderer. Instead, I had the client calculate and store the transformation matrix in the entity, and only when something (origin or angles) changes. This means a static entity's transform is calculated only once for the duration of the level, rather than every frame. I then used glPushMatrix, glMultMatrixf (or glLoadMatrixf), and glPopMatrix whenever I needed to change transformations.
Now for the trick for getting this to work with drawing brush models with the world. I gave surfaces a transform pointer. Surfaces belonging to the world get a null pointer for the transform. Brush models provide a pointer to the entity's transform matrix for the surface's transform.
Now on to drawing... before running the surface chains, the model view matrix is set up for rendering the world. Then, while running through the surface chains, if a surface's transform pointer is null, nothing special is done. However, if the surface's transform pointer is not null, then the code does a glPushMatrix/glLoadMatrixf before drawing the surface polys, then a glPopMatrix after drawing the polys. Since most surfaces will not be split (only water and sky), and sub-models in their "home" position are all untransformed, this is not the most efficient. However, instanced model surfaces (probably more common) are always transformed, so the minor bit of inefficiency seems to be a reasonable trade-off for the simpler code.
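The dispatch in that inner loop is small enough to sketch. This isn't QF's code: the `surface_t` layout is hypothetical, and the GL calls are replaced with counting stubs so the logic can run without a GL context:

```c
#include <stddef.h>

/* Counting stand-ins for glPushMatrix, glLoadMatrixf, glPopMatrix
   and the poly-drawing call, so the dispatch can run without GL. */
static int pushes, pops, loads, draws;
static void push_matrix (void)           { pushes++; }
static void load_matrix (const float *m) { (void) m; loads++; }
static void pop_matrix (void)            { pops++; }
static void draw_polys (void)            { draws++; }

/* Hypothetical surface: world surfaces carry a null transform,
   brush-model surfaces point at their entity's matrix. */
typedef struct surface_s {
    const float      *transform;
    struct surface_s *next;     /* surface chain link */
} surface_t;

static void
draw_surface_chain (surface_t *chain)
{
    for (surface_t *surf = chain; surf; surf = surf->next) {
        if (surf->transform) {
            push_matrix ();
            load_matrix (surf->transform);
            draw_polys ();
            pop_matrix ();
        } else {
            draw_polys ();      /* world modelview already current */
        }
    }
}
```

World surfaces take the cheap branch; only the (rarer) transformed surfaces pay for the matrix push/pop.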
The messy part of all this (for gl) was getting sky chains to work properly. I also had some issues with instanced models, but that might have been before I figured out why they were such a PITA.
I currently have a problem in the glsl renderer with certain surfaces of sub-models not getting the right transform, but I believe that to be a problem in my recent optimization run. Probably just messed up the uniform load logic somewhere.
One other benefit of getting the gl renderer to draw brush models with the world is that it made creating the VBO for glsl much easier, as I could abuse the surface chaining code to help build the lists.
Anyway, unless you have a specific reason to do so (single texture fog, building VBOs...), I have to agree that drawing brush models with the world is more effort than it's worth (about 1% isn't much of a gain). However, moving the transform calcs out of the renderer is easy and worth it.
Here's the commit message for the entity transform patch (tweaked to look good in the forum).
commit 3eb859a88f1c05eb10a8a9e7d6b4f7418d95979a
Author: Bill Currie <bill@taniwha.org>
Date: Thu Dec 15 12:06:03 2011 +0900
Move the entity transform setup into the clients.
This has several benefits:
- The silly issue with alias model pitches being backwards is kept out
of the renderer (it's a quakec thing: entities do their pitch
backwards, but originally, only alias models were rotated. Hipnotic
did brush entity rotations in the correct direction).
- Angle to frame vector conversions are done only when the entity's
angles vector changes, rather than every frame. This avoids a lot of
unnecessary trig function calls.
- Once transformed, an entity's frame vectors are always available.
However, the vectors are left handed rather than right handed (ie,
forward/left/up instead of forward/right/up): just a matter of
watching the sign. This avoids even more trig calls (flag models in
qw).
- This paves the way for merging brush entity surface rendering with the
world model surface rendering (the actual goal of this patch).
- This also paves the way for using quaternions to represent entity
orientation, as that would be a protocol change.