Happy π Day!!
Spotted this over on Of Zen and Computing, and I thought it bore repeating to gain some Google traffic.
If you have ever used G-Archiver to back up your GMail, immediately change your GMail password and uninstall G-Archiver. G-Archiver e-mails your GMail username and password to the program's author. The program's source code contains the author's GMail login credentials, an abnormal occurrence that led a curious reader of Coding Horror to discover the dark side of G-Archiver. Coding Horror reader Dustin Brooks took a peek at the author's GMail account, and discovered that John Terry, creator of G-Archiver, is in possession of the usernames and passwords of thousands of people who have used G-Archiver.
Quite a while ago, my wife left me alone at home for an evening with nothing to do. Arguably, I could have cleaned or something, but I was feeling crafty, and had happened upon a tutorial on Instructables for creating a Floppy Disk Pen Holder. I looked at the first page, totally grokked the idea, and whipped up two of these utilitarian beasts of burden.
I must admit that this is an almost perfect re-purposing of "things I was going to throw away some day". I've seen nifty uses for discarded recordable CDs and DVDs, as well as spindle containers. I usually recycle the latter and shred the former, but were I to ever actually require a bagel carrier, I know where I'd get one!
Speaking of re-use, I think I have five more floppies lying around somewhere, and should be able to construct a third pen cup at some point. It would have to be a gift since I have, at this point, maxed out my ratio of floppy-disk pen cups to desks.
This is helpful for all of those Linux users out there, but for those of us who have a Windows desktop with the same needs, the solution is a bit different. If you use PuTTY (or PuTTY Tray, like me), there's a field for Seconds between keepalives on the Connection configuration pane. Check out the image at right for the rockin' detail.
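For reference, the OpenSSH-side setting (what the Linux-flavored tip boils down to) lives in ~/.ssh/config. The numbers here are just sensible defaults, not gospel:

```
# ~/.ssh/config
Host *
    # send an application-level keepalive every 60 seconds
    ServerAliveInterval 60
    # declare the connection dead after 3 unanswered keepalives
    ServerAliveCountMax 3
```

PuTTY's "Seconds between keepalives" field does the same job on the Windows side.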
- Rebooting After Tough Criticism (tags: management)
- Using Outlook to Get Things Done (tags: lifehacks, productivity, software notes, windows)
- Managing Burnout (tags: management)
Each one seemed like a good idea at the time. Maybe I'll come back to the idea some day when the mood strikes or the topic is fresher in my mind.
About four months ago, I set out to fix a looming technology problem at home: I had no solid backup strategy. After some arguably un-scientific research, I came up with a solution which has given me a decent amount of the ever-sought-after peace of mind. It looked something like this:
- I subscribed to JungleDisk using Amazon S3 storage. It backed up about 30Gb of data (in about 10 days! ouch!) and then ran in the early morning to keep itself in sync.
- I set up an rdiff-backup script to mirror the important stuff to an external USB drive.
- I created a subversion repository for all of my installers, tools, and installation CD ISO images (which I created using MagicISO).
- I copied my entire media library onto a cheap, huge external SATA drive and brought it to work, which is a more secure location than my house.
- I mirrored my svn repository onto that external drive as well and set up a batch script to update it weekly.
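For flavor, the recurring pieces of that list could be wired up something like this. This is a sketch, not my actual jobs; every path and schedule below is made up, and the svn mirror line assumes the mirror was seeded once with svnsync initialize:

```
# Hypothetical cron entries -- paths and times are illustrative only.
# Nightly rdiff-backup mirror of the important stuff to the USB drive:
30 2 * * * rdiff-backup /Users/shelton /Volumes/USBBackup/shelton
# Weekly update of the svn mirror on the offsite drive:
0 4 * * 0 svnsync synchronize file:///Volumes/Offsite/svn-mirror
```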
All in all, this was a really good first effort. I was happy and felt secure, though I haven't once had the need to use this wonderful system. I have, however, found some parts of it to be annoying. First of all, JungleDisk is SLOW. Really, really slow. Well, ok, it's only slow to upload. It took TEN days to upload about 30Gb of data, which is about 34kb/sec. As a test, I did some uploads to servers around the country (I have friends in fun places), and averaged about 100kb/sec, so in my opinion either Amazon throttles their incoming bandwidth, which I can understand, or my route to S3 stinks. In addition, it slows my computer down when it runs. I feel like that's really out of the question for a modern application.
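The arithmetic on that rate checks out; here's a quick sanity check (assuming decimal gigabytes):

```shell
# 30 GB uploaded over 10 days, expressed in kB/s
awk 'BEGIN { printf "%.1f kB/s\n", 30e6 / (10*24*60*60) }'
# prints "34.7 kB/s"
```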
These things are annoying, but they aren't a hill to die on. What was really bothering me was the cost. I had estimated that it would cost about the same each month to use S3 as it does to use mozy.com or other similar options. I was wrong! It costs $4.95 each month for mozy, and about $7 each month for S3 with my usage. It's not a huge difference, but it is annoying.
Back when I was looking at online storage, I tried mozy seriously and found the biggest weakness to be their Mac client, which I failed to note at the time was just freshly out of private beta. On a whim, I tried it again, still in beta but a tad newer. It still does that thing every time you start it up where it tries to scan your probable backup sets even though I don't want it to. It does not, however, keep crashing, and it is, to my delight, a LOT faster. It hit my average upload speed when I let it run unthrottled, and even then, my mac barely noticed it was doing anything major with its network connection. The client lets you throttle the bandwidth use during a specific time range, so I turned it down to about 48kb/sec when home in the evenings so that the wife doesn't notice that it's chugging away. (She definitely noticed with JungleDisk... low WAF to be sure!) It, therefore, uploaded the same 30Gb of data in under FOUR days. I even got to tail its log file and watch it upload each file. Exciting stuff, I tell you.
BUT, to complicate matters in the meantime, I had started using svn to store my digital images as well, so all of those lovely .svn directories were lying around, and there was no smart way to tell mozy to leave them alone. So, I devised a fairly straightforward workaround: I modified my rdiff-backup job to ignore these files and populated my mozy backup set with the rdiff'd backup set instead. I stagger the cron and mozy jobs to keep everything in sync, up to date, and backed up. It works quite well, and I never have to look at it. Ever.
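The modified job boils down to one extra flag; a sketch of the idea, with hypothetical paths and schedule:

```
# Hypothetical crontab line: mirror the photos while skipping svn
# metadata, so the mozy backup set never sees a .svn directory.
15 3 * * * rdiff-backup --exclude '**/.svn' /Users/shelton/Pictures /Backups/Pictures-rdiff
```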
I do, of course, look at it every single day because I'm paranoid. I'd love to say I'll start trusting it to just keep working, but to be honest, I like knowing for certain that my data are safe in the event of a flood, fire, EMP, etc.
I also installed the windows client at work under a different e-mail account (I only want the 2Gb free service and you can't use both with the same account) and use it to back up my work documents. We use Acronis at work, but also use PGP whole-disk encryption and I just don't trust that my drive won't some day get itself all corrupted in a tizzy. Acronis fails semi-randomly and so far mozy doesn't, so I'm not going to take any chances with stuff that's important because, well, that's the whole point. Besides, mozy's windows client includes an awesome mapped drive that lets me browse right into my recent backup and grab files as needed. Seriously, when they add that feature to the mac client my backup life will be complete. That was, actually, one of the only things I really liked about JungleDisk.
So, in the end, I only changed direction slightly. I have yet to do any of the other things I needed to do for extra security, though I did lock my mac to my radiator pipes when I went away for two weeks. It made prudent sense at the time, though when I think about it now, it seems silly.
As the digg submitter said, "I'm betting on the camel".
This image got me thinking about how this is allegory for bringing new ideas to old organizations. In a thankfully distant past job, I found myself in exactly this situation, trying to innovate where it was nearly impossible to do so. It's hard when you come up against a camel - they're hard to move when they don't want to, but when they want to, they do, and good luck stopping them. It doesn't matter at all how shiny the car is that you happen to be driving.
My advice to those driving cars in this post's allegorical world is simple - find a different road. You're not going to get the camel to leave or move on your own. If there happens to be a camel-wrangler around, you may be in luck, but they have to agree with the direction in which you want to go.
Save for that sadly rare scenario, if innovation and newness are important to you, go find some newness somewhere else. I wish I had followed that advice sooner!
During the backup project, I created a subversion repository for all of my installation media and installation files - anything I'd need to set up a new computer. The intention was to have a system that was a solid backup for these data, available for download from just about anywhere. Over the months, I've also added other stores to the repository, such as my useful windows standalone tools.
Recently, I thought it would be useful to have all of my digital photos in the repository, both for backup purposes as well as to allow me to, say, have all of my downloaded desktop wallpapers on my machines at work and at home. What better purpose, right? The biggest pain was keeping the repository up to date without constantly needing to issue pseudo-random svn add commands every time we take pictures. (I could use scplugin to make this easier, but as I found out, it kinda sucks.)
Instead, I turned to my love for shell scripting, and fairly quickly banged this out. I used the handy svn status command with impunity to help me handle recursive adds and deletes.
Of note, yes, I wish SyntaxHighlighter worked here...
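The script itself didn't survive the move between blog engines, but the core trick was parsing svn status output. A minimal sketch of the idea (not the original script; the function name and formatting are mine):

```shell
#!/bin/sh
# Translate `svn status` output into add/delete commands.
#   '?' = unversioned item -> needs `svn add`
#   '!' = missing item     -> needs `svn rm`
plan_changes() {
  while read -r flag path; do
    case "$flag" in
      \?) printf 'svn add "%s"\n' "$path" ;;
      !)  printf 'svn rm "%s"\n' "$path" ;;
    esac
  done
}

# In a live working copy you would run something like:
#   svn status | plan_changes | sh
```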
It would be really nice if there were SyntaxHighlighter support for shell scripts. Maybe that's worth working on myself, eh? Updated 2009-01-12: Sha-Zam!
This also required some modification to my rdiff-backup cronjob since, well, I didn't want to back up every single .svn directory and the files inside, as that would effectively double the backup requirements for that directory. So, I played a bit more with rdiff-backup's options and ended up changing from this:
This is obviously MUCH cleaner.
rdiff-backup-files.txt looks like this:
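The original listing was lost along the way; a hypothetical reconstruction of the setup (only the .svn exclude is documented, so every other path and entry here is a guess) might look like:

```
# rdiff-backup-files.txt -- hypothetical reconstruction
- /Users/shelton/**/.svn
/Users/shelton/Documents
/Users/shelton/Pictures
/Users/shelton/VM

# and the cleaner invocation, using rdiff-backup's globbing filelist:
# rdiff-backup --include-globbing-filelist rdiff-backup-files.txt \
#     /Users/shelton /Volumes/Backup/shelton
```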
That first line, - /Users/shelton/**/.svn, is by far the most important, and took way too much trial and error to get right. Order of statements is, apparently, VERY important for rsync/rdiff-backup. Another note about the above list: I'm now backing up things that I wasn't before, namely my VM directory, which contains my Parallels virtual machines. It started to make prudent sense, though currently the only VM in there is a very tiny cfg fileset to virtual-boot my BootCamp partition. I've started doing several things differently with regard to my home backup, which I'll have to summarize at some point soon.
I will not let the Whiz Kid conduct research aboard my ship. If he's got a theory that he's itching to test, I will deposit him on an uninhabited planet in friendly space, and make sure that I'm out of the system before he's done unpacking.