QuakeDev.com downtime and request for archival

Discuss anything not covered by any of the other categories.
Error
InsideQC Staff
Posts: 865
Joined: Fri Nov 05, 2004 5:15 am
Location: VA, USA
Contact:

Post by Error »

Baker wrote: This is kind of an open question to anyone, but the following sites at QuakeDev at a minimum are really important:

1. The Quake Wiki
2. Quake Expo 2001, 2005, 2006
3. Quake Retexturing Project
4. Quake Matt's Gyro stuff

Is this stuff migrated? #1 and #2 seem particularly important and particularly difficult to preserve.
I'm working on the QExpo sites, and I'd like to host Quake Matt's work, as I was a huge supporter of his. I could also take care of the Quake Wiki and Retexturing project. It's just that I feel I need to get permission to host the others.
leileilol
Posts: 2783
Joined: Fri Oct 15, 2004 3:23 am

Post by leileilol »

quake matt's site is necessary since it's the only place for gyro documentation; gyro never included any offline docs for usage
i should not be here
Error
InsideQC Staff
Posts: 865
Joined: Fri Nov 05, 2004 5:15 am
Location: VA, USA
Contact:

Post by Error »

I have a version that has the offline docs....
Baker
Posts: 3666
Joined: Tue Mar 14, 2006 5:15 am

Post by Baker »

Looks like the Quake Retexturing Project site is safe. I checked with Moon[Drunk] and it appears those guys already talked to Solecord, and it'll be moved over to QuakeOne.com with the Reforged guys.
The night is young. How else can I annoy the world before sunrise? 8) Inquisitive minds want to know! And if they don't -- well, like that ever has stopped me before ..
Feared
Posts: 95
Joined: Fri Jun 11, 2010 11:58 pm
Location: Wylie, TX

Post by Feared »

I've already backed up QuakeMatt's website (plus all the downloads).
http://bfeared.com/library/index.php?di ... quakematt/

I've backed up most of the user sites (qexpo is big, though, so I haven't had a chance yet). With the link above you can download a tar dump of the current directory you're viewing.
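For anyone who wants to make the same kind of per-directory dump locally before the host disappears, it's basically a one-liner with tar. A minimal sketch; the `quakematt` directory and file names here are just stand-ins, not the real backup paths:

```shell
# Build a small stand-in directory for a mirrored site
# ("quakematt" is only an example name, not the actual backup path)
mkdir -p quakematt/downloads
echo "gyro docs" > quakematt/readme.txt
echo "placeholder" > quakematt/downloads/gyro.zip

# Pack the whole directory into one tarball, the same kind of
# "tar dump of the current directory" the backup site offers
tar -czf quakematt.tar.gz quakematt

# List the archive contents to verify everything made it in
tar -tzf quakematt.tar.gz
```

Anything `tar -tzf` lists made it into the archive, and gzip keeps the dumps small enough to rehost elsewhere.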
Sajt
Posts: 1215
Joined: Sat Oct 16, 2004 3:39 am

Post by Sajt »

I'll back up all the stuff I can find a password for. This includes the Quake Wiki and probably one or two expos. And this I will do soon!
F. A. Špork, an enlightened nobleman and a great patron of art, had a stately Baroque spa complex built on the banks of the River Labe.
apolluwn
Posts: 35
Joined: Tue Apr 26, 2011 11:57 pm

Post by apolluwn »

Feared wrote: I've backed up most of the user sites (qexpo is big, though so I hadn't had a chance). With the link above you are able to download a tar dump of the current directory you're viewing.
The qexpo stuff really isn't that huge: 934 MB (about a quarter of everything)... I'm not sure if you are having this exact issue or some other issue, but qexpo2001.quakedev.com, in particular, could leave the impression that it just won't finish...

I've noticed that you can get stuck downloading booth.php because of the way the page handles the variable n; unfortunately, wget doesn't have an adequate way of mirroring this kind of behavior.
The -A and -R switches take effect only after wget has already made the request ...

If anyone is having trouble with this, you could try this script (at your own peril), which should grab everything and fix the links that didn't get converted...

I don't know if this code box will work well with this... but it may save some people a bit of work.
Also, there are certainly more efficient/elegant ways to do this, but it'll give you an idea of what the problem is.

If it ends up that someone hosts the original PHP code on their web server, it would be wise to fix this there.

-BEWARE- The code tag appears to add whitespace after \'s for unfinished lines. You may need to remove this whitespace, or place it all on a single line.

Code: Select all

#!/usr/local/bin/bash
PERL=$(which perl)
WGET=$(which wget)


echo -e "Mirroring qexpo2001.quakedev.com...";
sleep 2

$WGET -m -k -K -N -rH -R"booth.php" \
-Dqexpo2001.quakedev.com,quakedev.com,ajaysquakesite.co.uk,nosferatuthegame.com \
-E -T 30 -t 1 -Xforums,wiki,phpBB3 --exclude-domains=board.nosferatuthegame.com,facelift.quakedev.com \
http://qexpo2001.quakedev.com/ ; 

echo -n "Done.";
echo -e "\nDownloading booths...";
sleep 2

for (( i=1;i<=70;i+=1 ));
do
  $WGET -m -k -K -N -rH -R"booth.php" \
-Dqexpo2001.quakedev.com,quakedev.com,ajaysquakesite.co.uk,nosferatuthegame.com \
-E -T 30 -t 1 -Xforums,wiki,phpBB3 --exclude-domains=board.nosferatuthegame.com,facelift.quakedev.com \
http://qexpo2001.quakedev.com/booths/booth.php?n=${i} ; 
done

echo -n "Done.";
echo -e "\nFixing links in root dir...";
sleep 2

cd ./qexpo2001.quakedev.com ;

for (( i=1;i<=70;i+=1 ))
do
  $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/booth.php\?n=${i}/.\/booths\/booth.php%3Fn=${i}.html/g;" *.html ;
done

$PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/.\/booths\/createbooth.php.html/g;" *.html ;
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;

echo -n "Done.";
echo -e "\nFixing links in booths dir...";
sleep 2

cd ./booths ;
cp login.php.html createbooth.php.html ;

for (( i=1;i<=70;i+=1 ))
do
  find . -name '*.html' -type f -exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/booth.php\?n=${i}/.\/booth.php%3Fn=${i}.html/g;" {} \; \
-exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/upload\/${i}/.\/upload\/${i}/g;" {} \; 
done

$PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/.\/createbooth.php.html/g;" *.html ;
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;


echo -n "Done.";
echo -e "\nFixing links in events dir...";
sleep 2

cd ../events ;

find . -name '*.html' -type f -exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/..\/booths\/createbooth.php.html/g;" {} \; \
-exec $PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/login.php/..\/booths\/login.php/g;" {} \;
 
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;

cd ../features ;

echo -n "Done.";
echo -e "\nFixing links in features dir...\n";
sleep 2

$PERL -pi -e "s/http:\/\/qexpo2001.quakedev.com\/booths\/createbooth.php/..\/booths\/createbooth.php.html/g;" *.html ;
$PERL -pi -e "s/http:\/\/forums.inside3d.com\/wwwthreads.pl\?action\=list\&Board\=QExpo/http:\/\/forums.inside3d.com\//g;" *.html ;

echo -e "All done.\n";
code edit: any link still pointing back to qexpo2001.quakedev.com should only be there because it was a dead link anyway, and the forum link is changed to forums.inside3d.com instead of the broken link.
Feared
Posts: 95
Joined: Fri Jun 11, 2010 11:58 pm
Location: Wylie, TX

Post by Feared »

I already did QExpo 2001. I just need to get around to 2005 and 2006. Doing 2005 right now. Hopefully I'll have enough time today to do 2006.
Entar
Posts: 439
Joined: Fri Nov 05, 2004 7:27 pm
Location: At my computer
Contact:

Post by Entar »

So I take it this means that we're opting for archival and migrating hosting, rather than maintaining the existing hosting/domain? Just want to check before I arrange for different hosting.
Last edited by Entar on Sat May 07, 2011 3:07 pm, edited 1 time in total.
Baker
Posts: 3666
Joined: Tue Mar 14, 2006 5:15 am

Post by Baker »

I think many people would like to see those sites rehosted somewhere that is stable and hopefully long-term.

They are an important part of the history of the Quake modding community.
Danfun64
Posts: 16
Joined: Tue Apr 06, 2010 2:47 pm

quakedev request

Post by Danfun64 »

hello again :P

Shame that quakedev is dead, because I am looking for a specific file.

synq-alpha-007.7z

It contains a couple models and textures/skins that I find highly valuable. I know that synq is dead but that only makes the files I need harder to find. Please reupload it, at least for my sake?
Baker
Posts: 3666
Joined: Tue Mar 14, 2006 5:15 am

Re: quakedev request

Post by Baker »

Danfun64 wrote:hello again :P

Shame that quakedev is dead, because I am looking for a specific file.

synq-alpha-007.7z

It contains a couple models and textures/skins that I find highly valuable. I know that synq is dead but that only makes the files I need harder to find. Please reupload it, at least for my sake?
http://synq.quakedev.com/synq-alpha-007.7z
Chip
Posts: 575
Joined: Wed Jan 21, 2009 9:12 am
Location: Dublin, Ireland
Contact:

Post by Chip »

Feared wrote:I've backed up most of the user sites [...]. With the link [...] you are able to download a tar dump of the current directory you're viewing.
Thanks for saving Tremor. I'll take a .tar dump of that too, for my offline use.
QuakeWiki
getButterfly - WordPress Support Services
Roo Holidays

Fear not the dark, but what the dark hides.
Chip
Posts: 575
Joined: Wed Jan 21, 2009 9:12 am
Location: Dublin, Ireland
Contact:

Post by Chip »

@LordHavoc (I tried sending a PM but it wouldn't leave the outbox):

Hey LH, when is quakedev.com expiring? Are you aware of the exact date? Are you aware if anyone wants to take it back?

Also, I tried asking Echon but got no answer. Any chance of cPanel/FTP/MySQL access to the forum database? I'd like to save the entire forums.

Thanks.
LordHavoc
Posts: 322
Joined: Fri Nov 05, 2004 3:12 am
Location: western Oregon, USA
Contact:

Post by LordHavoc »

Chip wrote:Hey LH, when is quakedev.com expiring? Are you aware of the exact date? Are you aware if anyone wants to take it back?
I do not know the exact date.

As far as I know, no one has offered to pay the next monthly bill to keep the server running.

apolluwn has offered to take over (by migrating the site to a new host and maintaining it).

icculus has offered to archive the entire site in place, but that would not be maintained, simply a frozen archive.

It is my understanding that icculus made a full backup of the server filesystem (not just the user-visible front side but the entire server) a couple of weeks ago.

I have a backup of the entire web and svn portions.

I do not know enough about good methods to back up the rest to pursue it.

If you can get me instructions on what files I should archive from the filesystem side, I'll happily grab the forums.
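For reference, on a typical phpBB install "grabbing the forums" comes down to dumping the database plus a few directories of user-uploaded files. This is a rough sketch only; it can't run without the live server, and the database name, user, and install path below are placeholders (the real values live in the board's config.php):

```shell
# Dump the forum database (schema + data) to one SQL file.
# forum_user / forum_db are placeholders; check config.php for the real ones.
mysqldump -u forum_user -p forum_db > forum_db_backup.sql

# Archive the parts of the phpBB tree that aren't in the database:
# attachments, uploaded avatars, and the config itself.
# /var/www/forums is a placeholder for the actual install path.
tar -czf phpbb_files.tar.gz \
    /var/www/forums/files \
    /var/www/forums/images/avatars/upload \
    /var/www/forums/config.php
```

Restoring is roughly the reverse: create an empty database, feed it the SQL dump with mysql, and unpack the tarball over a matching phpBB version.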

If we are to work out a new maintainership, it had better happen soon.

I can pay the bill a second time if necessary, but I need to be convinced people are going to act on it.