Geektastic project du jour
Moving into a new (and somewhat smaller) apartment in the city and sharing that space with another person (for the first time) has got me thinking about some important things.
Like our home network.
The first few nights I started having fantasies about replacing the two inkjet printers we brought to this new space with a spiffy new networked color laser. Later I realized I almost never print. So we settled on keeping one of the inkjets around and plugged it into Stephanie’s desktop.
I was also thinking about augmenting my cable modem wireless access point combo with another wireless access point, the Linksys WRT54GL. You know, the one that you can install your own firmware on. To do what with, you ask? I don’t know, whatever I want. I was considering this because the wireless on the combo device was dropping connections and disappearing for minutes at a time (much like the wireless access point it replaced). But since moving to the new apartment there hasn’t been a problem. Maybe the signal was being disturbed by an errant cordless phone in my previous building?
Lastly I was thinking about getting a network attached storage drive so Stephanie and I would have a place to store, backup, and share files. But the reviews for those devices weren’t winning me over. And they’re not cheap.
Tonight I finally put 2 and 2 together and realized, I’ve got a Mini-ITX box lying around unused, and I’ve got an external 160GB USB hard drive doing the same. Put together, that’s a fileserver if I ever saw one. I figure, why not just install the server edition of Ubuntu on the Mini-ITX, plug in the external hard drive, and shove it all under a bookcase.
If only things were that simple. I downloaded and burned a CD with the server edition, installed it (not a blazing fast process on the 600 MHz ME6000), rebooted, and it rebooted again, and again, and again, ad infinitum. Turns out I’m not alone. Apparently Ubuntu Server 6.06.1 does not boot on the VIA EPIA ME6000. Ah well, that’s nice. My luck getting an OS onto this system has never been great. Thankfully someone somewhere out there on the intarweb suggested using the alternate install CD because of the problems with the server edition on the VIA EPIA. Okie dokey, let’s give that a whirl.
Woo. We have a server set up. The best part is, it’s a full blown, general purpose computer, unlimited by flash memory size or firmware specialization. I can do anything I want.
apt-get install openssh-server
apt-get install samba
Followed the Ubuntu Guide’s How to share group folders with read/write permissions (Authentication=Yes) and bam! We’ve got a fileserver.
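For reference, the share stanza the guide has you add to /etc/samba/smb.conf ends up looking something like this (the share name and path here are just my hypothetical setup, so adjust to taste):

[share]
path = /media/usbdrive/share
writable = yes
valid users = @users
force group = users
create mask = 0660
directory mask = 0770

Then each person who needs access gets a Samba password with sudo smbpasswd -a username, followed by a sudo /etc/init.d/samba restart to pick up the changes.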
F*ckin’ nerd…
Okay, okay. So I’m green with envy.
The best part is, I can punch a hole in my firewall and send all ssh traffic to the box to access files from outside the network.
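In practice that means from any machine with an SSH client, grabbing a file is a one-liner (hostname and path invented for illustration):

scp jwatt@my-home-ip.example.com:/media/usbdrive/share/somefile.txt .

Or fire up sftp and browse around interactively.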
I have a similar VIA EPIA board. I should try this out. So you were able to get regular Ubuntu to run with it?
My problem is I think too damn big with these projects. I want a RAID 5 1 TB file server. Cheaper than it used to be, but not too cheap. Reason? So I can clone an entire filesystem and make incremental backups. That way if something goes really wrong I just reimage the whole damn thing. :)
I do the same. I was ready to buy a new printer, wireless access point, NAS, etc. ($700+) Then I realized, I don’t really need any of it, and even better, what I already have (but wasn’t using) is way more general purpose and thus flexible for my whims.
Automating backups is definitely phase II of this project. However, both Stephanie and I run Windows on our respective computers, so I’m in search of some way to create a background process that transparently and periodically copies files from our machines across the network. Any ideas?
One suggestion I’ve heard is not to install anything on our Windows boxes, but rather configure our My Documents folders on Windows to be shareable, and then set up smbfs on the MiniITX with a cron job that periodically fetches the files from our machines and stores them on the external USB drive.
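Roughly, that would boil down to something like this on the MiniITX (machine names, credentials, and paths are all made up for the sake of example):

sudo mount -t smbfs -o username=steph,password=secret //stephanie-pc/MyDocuments /mnt/stephanie

And a crontab entry to copy everything over nightly at 3am:

0 3 * * * rsync -a /mnt/stephanie/ /media/usbdrive/backups/stephanie/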
And yeah, once I installed Ubuntu using the alternate-install CD, I was off to the races. No problems at all.
I assume you’ve read Mark Pilgrim’s post about data backups? Thankfully I don’t have any digital video, so I can pretty much store all my digital detritus on one hard drive, but as it stands all my data currently exists only on that one laptop (minus any data hosted online: blog, photos, etc.), which is a pretty big single point of failure. I would sleep easier knowing that data existed on at least one other hard drive.
For backing up Windows machines, I’d store all your important stuff on the file server like you said. But to back up your OS and apps, I’d clone the entire system partition. Might be a good idea to create two partitions on your Windows boxes: a system partition and a data partition. OS and apps on system, and data on data. :)
I used a free open source clone app called g4u when I worked as a tech in AmeriCorps. It helped me clone over 30 desktop PCs and 27 laptops.
g4u can copy and restore an entire filesystem across a network with FTP. It uses a boot floppy or CD-R to boot the box you want to clone. Built on NetBSD. Not sure if it does incremental backups. You can always just back up your My Documents folder, or make new whole backups and delete the old ones. :)
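If I remember right, the whole round trip is just two commands at the g4u prompt after booting from the CD (the FTP server name here is made up):

uploaddisk ftp.example.com mybox.gz
slurpdisk ftp.example.com mybox.gz

The first pushes a compressed image of the disk up to the FTP server, the second pulls it back down to restore.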
An off-site backup solution I’ve heard is good is called Iron Mountain. Hundreds of computers here on campus are using this service. You can get yearly subscriptions for your Windows box for under $100.
Now if I can just get off my ass and create local backups of all our computers AND do off site backups too. Not to mention all the dbases and website code I have on webservers. Ugh!
It’s funny, I feel like I could live without a backup of my apps and their settings. Granted in the event of a crash, it would be nice to just restart everything from where I left off, but having moved between several personal and work machines and two different operating systems in the last few years, I’ve become less and less attached to the applications and configuration settings I’ve accreted on any given machine.
When I start with a new computer, I enjoy the zen-like simplicity of starting fresh and allowing actual need to motivate me to reinstall and reconfigure things.
That said, when I mentioned wanting to back up My Documents, I really meant everything under C:\Documents and Settings\jwatt on Windows, which includes My Documents, My Desktop, and scads of application settings. I’m not sure how well the situation fares on Mac OS X.
The only issue I’d have with G4U for personal use (if I understand it correctly) is that it requires me getting off my ass and manually cloning a hard drive (i.e. running a backup) with a bootable CD—which I’ve already demonstrated that I probably won’t do. I think a more successful backup would be completely transparent, and the more I think about it, the more I like the idea of the backup server itself being responsible for grabbing and backing up files for several machines, instead of configuring each individual computer to send files to the backup server.
Here’s an example backup server setup like I’m envisioning—though it’s a little more complex than my needs: Setting up a Backup Server
More links: Easy Automated Snapshot-Style Backups with Linux and Rsync and Snapshot backup using rsync and ssh
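The core trick in both of those articles is rotating hard-linked snapshots, so unchanged files cost almost no extra disk. A stripped-down sketch (snapshot names and paths invented, and I’d want to test this before trusting it):

cd /media/usbdrive/backups
rm -rf daily.2
mv daily.1 daily.2
cp -al daily.0 daily.1
rsync -a --delete /mnt/stephanie/ daily.0/

cp -al makes a hard-linked copy, and rsync only breaks the links for files that actually changed, so each daily.N looks like a full backup but mostly shares disk with its neighbors.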
If you’re both using Windows on the laptops, is it possible to just set up SyncToy to automatically copy the appropriate folders across the network every night?
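I believe SyncToy has a command-line switch for running all your folder pairs, so a Windows Scheduled Task pointed at something like this (path from memory, so double-check it) might be all you need:

"C:\Program Files\SyncToy\SyncToy.exe" -R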
Jason, thanks for the suggestion, I hadn’t heard of SyncToy before.
Chatted about the project with Kyle Rankin this weekend and he suggested checking out BackupPC—which sounds like exactly what I want. Apparently he uses it where he works with great success.
Oh yeah, and here’s the best part:
Looks like I’ve got a project tonight.
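On the Ubuntu side, getting started should be a one-liner (if I recall correctly, the Debian/Ubuntu package drops its config in /etc/backuppc):

sudo apt-get install backuppc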
Backups are so hot right now: Jeremy Zawodny on A List of Amazon S3 Backup Tools.
Shelley Powers reminds me that my $10/month Dreamhost account, the home of justinsomnia.org and editplus.info, provides 200GB of storage and 2TB of bandwidth a month. But I’m only using 2.5GB and 17GB respectively. That’s 1.25% and 0.85%!
That’s a pretty strong argument in favor of Dreamhost for off-site backup, considering I’m already paying for it.
By comparison, using Amazon’s S3 service to the maximum of what Dreamhost allows would cost $430 per month. Dreamhost = $420/month savings. Plus I get shell access. And a webserver. And a database server…
$0.15/GB stored * 200GB = $30
$0.20/GB transferred * 2000GB = $400
Update: I changed my disk usage number above. Turns out something is broken with Dreamhost’s reporting tool, and it was only showing my database disk usage (500MB). I’ve got another 2GB of files on top of that.
I’ve written more about using Ubuntu as a backup server here: How to regularly backup Windows XP to Ubuntu, using rsync