I did this little write-up because I thought some of you guys may be interested in reading what goes into running the site. The next couple of pages show the sequence in which everything was done to take Itchy and Scratchy - Orsm.net's new servers - from bits and pieces in boxes and bags to two kick ass machines.
Late last year I started tinkering with the idea of adding a second server to run the site. Things were just starting to slow down, and the bandwidth the server was plugged into was getting a touch congested as more and more of you guys came here. After much consulting with friends and people in the know, and very little research, I decided to take the plunge and build a couple of servers instead of using a rented machine. Great idea. Simple in theory - not quite so simple in practice.
The first thing we did was work out what was required. It was decided early on that it was time to go from a single server to two. We could have built one kick ass machine, but I'd end up needing a second one eventually, so there wasn't much point. Next we had to figure out what would need to run where - things like Apache, DNS, MySQL, PHP, mail and a whole heap of other things all had to be taken into consideration. There was no point setting them up to load balance each other - too much screwing around involved. Instead, each machine handles particular tasks. Itchy primarily just serves pages and stores all the HTML, images and PHP that you guys see. Scratchy handles video downloads, mail and MySQL stuff.
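To give a rough idea of how a split like that hangs together, here's a quick sketch. The hostnames and file types are made up for illustration - they're not what's actually configured on the boxes - but the principle is the same: light stuff gets linked from one machine, heavy stuff from the other.

    # Rough illustration only - the hostnames and file layout here are invented
    # for this sketch, not the real Orsm.net setup. The idea: light content
    # (HTML, images, PHP output) comes off one box, heavy content (videos)
    # off the other.

    ITCHY = "www.example.com"       # hypothetical hostname for the pages box
    SCRATCHY = "media.example.com"  # hypothetical hostname for the video/mail/DB box

    HEAVY = (".mpg", ".avi", ".wmv", ".zip")

    def url_for(path):
        """Work out which server a given file should be linked from."""
        host = SCRATCHY if path.lower().endswith(HEAVY) else ITCHY
        return "http://%s/%s" % (host, path.lstrip("/"))

    print(url_for("pictures/random01.jpg"))        # -> served by Itchy
    print(url_for("videos/clip_of_the_week.mpg"))  # -> served by Scratchy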
After that we compared the specs of the current and previous servers and our home computers, looked at how they had handled growth and traffic increases, and thought about what the future might hold - all in aid of getting a better idea of how much horsepower was required.
Whilst I didn't set a budget to work within, I tried as hard as possible to keep costs down. How to do that without skimping was the hard part. I had a few suggestions that would have been unreal, such as SCSI RAID, but when you consider that each server would have needed a SCSI RAID card and three SCSI drives, the cost of the whole project would almost have doubled. The other thing we wanted was dual Xeon processors, but once again the cost was prohibitive for what we were doing.
Anyways, after much deliberation and constant back and forth between Chris, Tim and myself, we decided the following would cut it [the below is for 2 machines]:
2 x Intel Pentium 4 2.4GHz Socket 478, boxed with heatsink & fan; 512K Northwood; 533FSB
1 x LG 52X speed CD-ROM drive
4 x 40GB Western Digital 7200RPM UATA100 hard drive; JB model; 8MB cache
2 x Intel D845PESV Pentium 4 (533/667MHz) motherboard; 333MHz DDR (PC2700); AGP 4X; ATX
2 x Panasonic 1.44MB floppy drive
2 x Intel PRO/100 S Desktop Adapter with i82550EY Fast Ethernet controller, 3DES 168-bit
4 x Direct PC 512MB PC2700 400MHz DDR RAM
2 x Sparkle PCI Nvidia TNT2 M64 chipset video card
2 x 2U SVEC server case w/ 300W power supply
2 x Server rails
2 x Promise FastTrak100 ATA100 PCI RAID 0/1 2-channel controller, OEM
So in a nutshell, EACH machine is a P4 2.4GHz running on an Intel board with 1GB of RAM and a Promise RAID controller [RAID 0] striping dual 40GB Western Digital hard drives, all sitting in a SVEC 2U case with a 300W power supply. Not bad huh!?
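If you want to check how the parts list turns into those per-machine numbers, the arithmetic is simple enough - just back-of-the-envelope maths, nothing server-specific:

    # Back-of-the-envelope check of the per-machine numbers from the parts
    # list above. RAID 0 stripes the two drives together, so their capacity
    # adds up - but there's no redundancy, so losing one drive loses the lot.

    machines = 2

    ram_sticks, stick_mb = 4, 512    # 4 x 512MB sticks split across 2 boxes
    print(ram_sticks * stick_mb // machines, "MB of RAM per machine")        # 1024MB = 1GB

    drives, drive_gb = 4, 40         # 4 x 40GB drives split across 2 boxes
    print(drives * drive_gb // machines, "GB of striped storage per machine")  # ~80GB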
A couple of people suggested I should have gone with Athlon processors, but from what I have seen and read I don't think it was a viable option when you consider the servers were going to be on the other side of the world from me. I've always used Pentiums and I've never had a problem with them - no need to change that now.
Getting the machines built wasn't too hard. Chris and I did it over a couple of days and didn't have a drama with any of the hardware I bought. Everything fitted like it was supposed to, and Itchy and Scratchy were born.
Then came time to get an operating system installed on them both. Piece of piss, I thought - I've installed Linux dozens of times. Note: here's where I started to become frustrated. Seems the RAID controllers I chose to use don't have good Linux driver support. After much fucking around and probably 15 attempts to get Red Hat, Debian and Mandrake installed, Tim waltzes in and gets Slackware running in about an hour. Bastard. This of course was done after a couple of beers had been consumed, so if anything fucks up really badly we know who to blame. I spent the next day transferring all the site files off my home computer onto the servers. They were more or less ready by this stage.
The next hurdle to overcome was shipping. I dropped them off on the Tuesday only to find out that they wouldn't be shipped until Friday [not impressed]. They finally left Perth and were shipped to Auckland in New Zealand, then Los Angeles and on to Dallas in Texas. After a few hassles with paperwork and a mildly assertive email from me, the boxes were passed off to US Customs, where they sat until Monday. By Wednesday Perth time the servers had arrived at the colocation facility and were plugged in. Success!
Itchy and Scratchy also see a big increase in available bandwidth. Up until now the site has run off a burstable 10Mbit pipe, meaning the fastest data can move through it is 10 megabits per second. The problem was that over the last few months the site wasn't having too much trouble maxing out the link. We've now got 30Mbit between the two machines, which means everything should be nice and fast.
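For anyone wondering what those pipe sizes mean in the real world, here's the rough maths. The 20MB video is just an example figure I've picked for the sketch, not an actual file on the site:

    # Rough maths on the pipe sizes. Link speeds are in megabits per second,
    # file sizes in megabytes (8 bits to a byte), so divide by 8 to compare.

    def seconds_to_download(file_mb, link_mbit):
        """Best-case download time, assuming you have the whole link to yourself."""
        return file_mb * 8.0 / link_mbit

    example_video_mb = 20   # made-up example file size
    for link_mbit in (10, 30):
        t = seconds_to_download(example_video_mb, link_mbit)
        print("%dMB file on a %dMbit pipe: about %.0f seconds" % (example_video_mb, link_mbit, t))

    # A 10Mbit pipe tops out around 1.25 MB/s and a 30Mbit pipe around 3.75 MB/s,
    # and that's shared between everyone hitting the site at once - which is why
    # the old link kept getting maxed out.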
Anyways, that about wraps up the story of how everything came together. The next few weeks of you guys surfing the site will tell the tale of whether it was all worth it, I guess. It's been one of the most stressful months I've ever had as well. There was so much shit I had to learn and work out to make this go smoothly that I'm amazed you guys are actually reading this page!
One last thing - HUGE thanks to Chris, Tim
and Chris M. Without their help this would never have happened.
Anyways, the next couple of pages are just progress
photos. Check em out...