
Posted: Tue May 03, 2011 4:53 pm
by Error
Baker wrote: This is kind of an open question to anyone, but the following sites at QuakeDev at a minimum are really important:

1. The Quake Wiki
2. Quake Expo 2001, 2005, 2006
3. Quake Retexturing Project
4. Quake Matt's Gyro stuff

Is this stuff migrated? #1 and #2 seem particularly important and particularly difficult to preserve.
I'm working on the QExpo sites, and I'd like to host Quake Matt's work, as I was a huge supporter of his. I could also take care of the Quake Wiki and Retexturing project. It's just that I feel I need to get permission to host the others.

Posted: Tue May 03, 2011 5:05 pm
by leileilol
quake matt's site is necessary since it's the only place for gyro documentation; gyro never included any offline docs for usage

Posted: Tue May 03, 2011 6:58 pm
by Error
I have a version that has the offline docs....

Posted: Tue May 03, 2011 8:03 pm
by Baker
Looks like the Quake Retexturing Project site is safe. I checked with Moon[Drunk], and it appears those guys already talked to Solecord; it'll be moved over to QuakeOne.com with the Reforged guys.

Posted: Tue May 03, 2011 8:21 pm
by Feared
I've already backed up QuakeMatt's website (plus all the downloads).
http://bfeared.com/library/index.php?di ... quakematt/

I've backed up most of the user sites (QExpo is big, though, so I haven't had a chance yet). Via the link above you can download a tar dump of whichever directory you're viewing.

Posted: Wed May 04, 2011 6:31 am
by Sajt
I'll back up all the stuff I can find a password for. This includes the Quake Wiki and probably one or two expos. And this I will do soon!

Posted: Wed May 04, 2011 7:28 am
by apolluwn
Feared wrote: I've backed up most of the user sites (QExpo is big, though, so I haven't had a chance yet). Via the link above you can download a tar dump of whichever directory you're viewing.
The QExpo stuff really isn't that huge: 934 MB (about 1/4 of everything)... I'm not sure whether you're hitting this exact issue or something else, but qexpo2001.quakedev.com in particular can leave the impression that it will just never finish...

I've noticed that you can get stuck downloading booth.php because of the way the page handles the variable n, and wget unfortunately doesn't have an adequate way of mirroring this kind of behavior.
The -A and -R switches take effect only after it makes the request...

If anyone is having trouble with this, you could try this script (at your own peril), which should grab everything and fix the links that didn't get converted...

I don't know if this code box will work well with this... but it may save some people a bit of work.
Also, there are certainly more efficient/elegant ways to do this, but it'll give you an idea of what the problem is.

If it ends up that someone hosts the original PHP code on their webserver, then it would be wise to fix this.

-BEWARE- The code tag appears to add whitespace after \'s for unfinished lines. You may need to remove this whitespace, or place it all on a single line.
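If the code box did mangle the continuations, here is a small sketch of one way to repair a saved copy (this is my addition, not from the post; GNU sed's in-place -i is assumed, and mirror.sh is a placeholder filename with demo contents):

```shell
# Demo input: a continuation backslash followed by a stray space,
# which is what the code box reportedly inserts.
printf 'wget -m \\ \n  http://example.com/\n' > mirror.sh

# Strip any whitespace after a trailing backslash so the shell's
# line continuations work again (GNU sed's in-place -i assumed).
sed -i 's/\\[[:space:]]*$/\\/' mirror.sh
```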

Code:

#!/usr/local/bin/bash
PERL=$(which perl)
WGET=$(which wget)


echo -e "Mirroring qexpo2001.quakedev.com...";
sleep 2

$WGET -m -k -K -N -rH -R"booth.php" \
-Dqexpo2001.quakedev.com,quakedev.com,ajaysquakesite.co.uk,nosferatuthegame.com \
-E -T 30 -t 1 -Xforums,wiki,phpBB3 --exclude-domains=board.nosferatuthegame.com,facelift.quakedev.com \
http://qexpo2001.quakedev.com/ ; 

echo -n "Done.";
echo -e "\nDownloading booths...";
sleep 2

for (( i=1;i<=70;i+=1 ));
do
  $WGET -m -k -K -N -rH -R"booth.php" \
-Dqexpo2001.quakedev.com,quakedev.com,ajaysquakesite.co.uk,nosferatuthegame.com \
-E -T 30 -t 1 -Xforums,wiki,phpBB3 --exclude-domains=board.nosferatuthegame.com,facelift.quakedev.com \
http://qexpo2001.quakedev.com/booths/booth.php?n=${i} ; 
done

echo -n "Done.";
echo -e "\nFixing links in root dir...";
sleep 2

cd ./qexpo2001.quakedev.com ;

for (( i=1;i<=70;i+=1 ))
do
  $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/booth.php\?n\=${i}/.\/booths\/booth.php\%3Fn\=${i}.html/g;" *.html ; 
done

$PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/.\/booths\/createbooth.php.html/g;" *.html ; 
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;

echo -n "Done.";
echo -e "\nFixing links in booths dir...";
sleep 2

cd ./booths ;
cp login.php.html createbooth.php.html ;

for (( i=1;i<=70;i+=1 ))
do
  find . -name '*.html' -type f -exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/booth.php\?n\=${i}/.\/booth.php\%3Fn\=${i}.html/g;" {} \; \
-exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/upload\/${i}/.\/upload\/${i}/g;" {} \; 
done

$PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/.\/createbooth.php.html/g;" *.html ;
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;


echo -n "Done.";
echo -e "\nFixing links in events dir...";
sleep 2

cd ../events ;

find . -name '*.html' -type f -exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/..\/booths\/createbooth.php.html/g;" {} \; \
-exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/login.php/..\/booths\/login.php/g;" {} \;
 
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;

cd ../features ;

echo -n "Done.";
echo -e "\nFixing links in features dir...\n";
sleep 2

$PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/..\/booths\/createbooth.php.html/g;" *.html ;
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;

echo -e "All done.\n";
Code edit: any link still pointing back to qexpo2001.quakedev.com should remain only because it was a dead link anyway, and the forum link is changed to point at forums.inside3d.com instead of the broken one.

Posted: Wed May 04, 2011 11:07 pm
by Feared
I already did QExpo 2001. I just need to get around to 2005 and 2006. Doing 2005 right now. Hopefully I'll have enough time today to do 2006.

Posted: Thu May 05, 2011 2:42 am
by Entar
So I take it this means that we're opting for archival and migrating hosting, rather than maintaining the existing hosting/domain? Just want to check before I arrange for different hosting.

Posted: Thu May 05, 2011 3:45 am
by Baker
I think many people would like to see those sites rehosted somewhere that is stable and hopefully long-term.

They are an important part of the history of the Quake modding community.

quakedev request

Posted: Wed May 11, 2011 10:43 pm
by Danfun64
hello again :P

Shame that quakedev is dead, because I am looking for a specific file.

synq-alpha-007.7z

It contains a couple models and textures/skins that I find highly valuable. I know that synq is dead but that only makes the files I need harder to find. Please reupload it, at least for my sake?

Re: quakedev request

Posted: Thu May 12, 2011 12:30 am
by Baker
Danfun64 wrote:hello again :P

Shame that quakedev is dead, because I am looking for a specific file.

synq-alpha-007.7z

It contains a couple models and textures/skins that I find highly valuable. I know that synq is dead but that only makes the files I need harder to find. Please reupload it, at least for my sake?
http://synq.quakedev.com/synq-alpha-007.7z

Posted: Mon May 30, 2011 3:05 pm
by Chip
Feared wrote:I've backed up most of the user sites [...]. With the link [...] you are able to download a tar dump of the current directory you're viewing.
Thanks for saving Tremor. I'll take a .tar dump of that too, for my offline use.

Posted: Mon May 30, 2011 3:16 pm
by Chip
@LordHavoc (I tried sending a PM but it wouldn't leave the outbox):

Hey LH, when is quakedev.com expiring? Are you aware of the exact date? Are you aware if anyone wants to take it back?

Also, I tried asking Echon but got no answer. Any chance of cPanel/FTP/MySQL access to the forum database? I'd like to save the entire forums.

Thanks.

Posted: Tue May 31, 2011 5:29 am
by LordHavoc
Chip wrote:Hey LH, when is quakedev.com expiring? Are you aware of the exact date? Are you aware if anyone wants to take it back?
I do not know the exact date.

As far as I know, no one has offered to pay the next monthly bill to keep the server running.

apolluwn has offered to take over (by migrating the site to a new host and maintaining it).

icculus has offered to archive the entire site in place, but that would not be maintained; it would simply be a frozen archive.

It is my understanding that icculus made a full backup of the server filesystem (not just the user-visible front side but the entire server) a couple of weeks ago.

I have a backup of the entire web and svn portions.

I do not know enough about good methods to back up the rest to pursue it.

If you can get me instructions on what files I should archive from the filesystem side, I'll happily grab the forums.
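For what it's worth, a hedged sketch of what "the forums" usually means on the filesystem side for a phpBB3 board: the MySQL database plus the directories that hold attachments, uploaded avatars, and the config. The database name and paths below are pure assumptions about the quakedev.com server, so this only prints the commands rather than running them:

```shell
#!/bin/sh
# Hypothetical values -- the real database name and phpBB root on
# quakedev.com are unknown; adjust before use.
DB=quakedev_forums
ROOT=/var/www/forums

# Dry run: print the backup commands instead of executing them.
echo "mysqldump -u root -p $DB > forums.sql"
echo "tar czf forums-files.tar.gz $ROOT/files $ROOT/images/avatars/upload $ROOT/config.php"
```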

If we are to work out a new maintainership, it had better happen soon.

I can pay the bill a second time if necessary, but I need to be convinced people are going to act on it.