mh wrote: I've always been wary of that PQ synthetic lag thing too. They may be only small buffers, but network communication is subject to the same rule as memory allocation, GPUs, and hard disks: lots of small events are orders of magnitude slower than a few large events, even when the total size is the same. It also doesn't help that it's never entirely clear what JPG was thinking or why he did certain things (there's a lesson in there about writing comments properly).
JPG earned a PhD at MIT. I'm glad he made his admittedly very "creative" improvements to ProQuake in the short time he had to do so. Many are inelegant and hard to maintain, but they were all backwards compatible. Still, I'd like to remove all the hacks ... but that would mean a network protocol change.
And I believe that will eventually happen, for a number of reasons:
1. Smaller demo files (sheesh!)
2. Server side multiview recording
3. True compressed downloading
4. Some sort of DP7 like predictive capability where needed.
5. 64 player support .... now why? Well ... I want multiplayer to have a mega hub server where groups of players can walk to, say, the CTF bar, press a button when the group is assembled and ready to play, and kabang! those clients get forwarded to an appropriate CTF server.
But the one I really want ... peer-to-peer coop, with a backend to support it.
Maybe those are my goals for the second half of 2012.
I believe open source leads to unlimited community building, expression, and invention ... and my thinking is that if you just build enough infrastructure and user-friendliness, you can set off a chain reaction.
I mean, if you look at all the cumulative works the greater Q1 community has created ... it is mind-blowing. FitzQuake, DP, FTE, your stuff, QER, Quark, Quake Reforged, Quaddicted, Quake Injector, QuakeC tools, mapping tools ... hordes of engine, QuakeC, and texture projects, and on and on ...