Keeping work and personal computers separated

I try as much as possible to keep my personal stuff separated from my work stuff. Both California and Utah laws are clear that what I develop on my own time and my own hardware is mine, as long as it is not related to my day job (that is the big difference with French law, under which I own what I develop on my own time even if it relates to my employer's business or uses its computers). Even so, that did not prevent one of my former employers from trying to claim ownership of what was rightfully mine. Because getting justice in the USA is very expensive, keeping things as separated as possible from the beginning seems like a good idea.

The best way to do that is simply not to work during one's free time on anything that could have potential business value – these days, I spend a lot of time learning about cryptography, control system engineering and concurrent systems validation. But keeping things separated still creates some issues, like having to carry two laptops when traveling. I did this twice for IETF meetings, and it is really no fun.

The solution I finally found was to run my personal laptop as an encrypted hard drive in a virtual machine on the company laptop. My employer provided me with a MacBook, which has nice hardware but whose OS is not very good – I had to put a reminder in my calendar to reboot it each week if I did not want to see it regularly crash or freeze. Mac OS X is a lot like Windows, except that you are not ashamed to show it to your friends. Anyway, here's how to run your own personal computer on your employer's laptop:

First you need a portable hard drive, preferably one that does not require a power supply. I use the My Passport Ultra 500GB with the AmazonBasics Hard Carrying Case. The next step is to install and configure VirtualBox on your laptop. You will need to install the Oracle VM VirtualBox Extension Pack if, like me, you need to use in your personal computer a USB device that is connected to the employer's laptop (in my case, a smart-card dongle that contains the TLS private key used to connect to my servers). The next step is to change the owner of your hard drive (unfortunately you will have to do that each time you plug in the hard drive):

sudo chown <user> /dev/disk2

After this you can create a raw vmdk file that will reference the external hard drive:

cd "VirtualBox VMs"
VBoxManage internalcommands createrawvmdk -filename ExternalDisk.vmdk -rawdisk /dev/disk2

After this, you just have to create a VM in VirtualBox that uses this vmdk file. I installed Debian sid with encryption, which took the better part of a day, as the whole disk has to be encrypted sector by sector. I also installed gogo6 so I could have an IPv6 connection in places that still live in the dark ages. Debian contains the correct packages (apt-get install virtualbox-guest-utils) so the X server in the personal computer will adapt its display size automatically to the size of the laptop's screen.
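For reference, the VM can also be created from the command line. This is only a sketch – the VM name ("personal"), memory size and OS type are my assumptions, not part of the original setup, and the commands are just printed until you empty DRY_RUN, so you can review them first:

```shell
#!/bin/sh
# Sketch: create and register a VM backed by the raw-disk ExternalDisk.vmdk.
# Commands are printed (not executed) until DRY_RUN is emptied.
: "${DRY_RUN:=1}"
run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }

run VBoxManage createvm --name personal --ostype Debian_64 --register
run VBoxManage modifyvm personal --memory 2048 --usb on
run VBoxManage storagectl personal --name SATA --add sata
run VBoxManage storageattach personal --storagectl SATA \
    --port 0 --type hdd --medium ExternalDisk.vmdk
```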

To restore my data from my desktop, I configured VirtualBox on it too, so I could also run the personal computer there. Then, thanks to the same Debian packages, I was able to mount my backups as a shared folder and restore all my data in far less time than an scp command would take.
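Inside the Debian guest, mounting the shared folder boils down to one command once the guest utilities are installed. A sketch, assuming the host directory was shared under the name "backups" (the share name and mount point are my assumptions); commands are only printed until DRY_RUN is emptied:

```shell
#!/bin/sh
# Sketch: mount a VirtualBox shared folder named "backups" in the guest.
: "${DRY_RUN:=1}"
run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }

run mkdir -p /mnt/backups
# Read-only is enough here, since we only restore from the backups.
run mount -t vboxsf -o ro backups /mnt/backups
```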

And after all of this I had a secure and convenient way to handle my personal emails without having to carry two laptops.

Things I learned last week (5)

IPv6 tunnel

Some of last week was spent preparing for the IETF meeting in Berlin. I explained previously that I use a secure IPv6 connection to my mail server to send and retrieve emails, which creates its own set of issues when travelling. While the IETF provides a great IPv6 network on site, there is very little hope of finding something similar in the various places one has to stay on the way to and from this event. E.g. hotels and airports are generally not IPv6 enabled, so a tunnel solution is required. I had a very good experience with Hurricane Electric before Comcast went IPv6 native, but their technology does not help much when the laptop is behind a NAT that cannot be controlled. So in this case I use the service provided by gogo6 (aka freenet6). I use their gogoCPE behind my AT&T NAT and their software client on my laptop – at least since last week, when I finally found the solution to the problem I had configuring it. Probably because I was using the WiFi connection instead of the wired connection, the gogoc daemon got stuck until I ran the following command and answered the prompt:

sudo /usr/sbin/gogoc -n -s wlan0

Backup restoration

A full backup is obviously useful when something terribly wrong happens to your disk (in the 90's I lost a disk full of code, and as I define stupidity as making the same mistake twice, I have been very careful since then to always have multiple levels of protection for my code), but having it also helps in day-to-day tasks – for example, when a code modification goes in the wrong direction, restoring the previous day's backup can save multiple hours of work.

Another benefit I discovered some time ago is in preparing my laptop before a trip. I like to carry my whole email and development history with me, so that's a lot of data to copy from one computer to another. Initially I created a giant tarball on a flash drive, and then uncompressed it on the target computer, but that took forever. Now I just reuse my backup. On the day before I leave, I restore on my laptop the directories I need directly from the last full backup (i.e. from the last Sunday). The improvement I made last week is that I then change the configuration of my mailer so emails are no longer deleted from my mail server. During the night, the incremental backup saves my new emails and the new configuration, so it takes less than 5 minutes before leaving for the airport the next day to restore the incremental backup – with the guarantee that during my trip all my emails will stay on the server for when I am back in my office. That means less wasted time, and less stress.
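The restore itself can be sketched like this – the dump file names and directory paths are assumptions (my actual layout differs), and the commands are only printed until DRY_RUN is emptied:

```shell
#!/bin/sh
# Sketch: pre-travel restore from the last full dump, then the night's
# incremental on top. restore -x extracts only the named directories.
: "${DRY_RUN:=1}"
run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }

# Day before leaving: selected directories from the Sunday full backup.
run restore -x -f /mnt/backup/full-0.dump ./home/user/Mail ./home/user/src
# Morning of departure: the overnight incremental brings in the new
# emails and the changed mailer configuration.
run restore -x -f /mnt/backup/incr-1.dump ./home/user/Mail
```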

Fireproof backup: another update

A little more than one year after replacing it, my backup disk was starting to show signs of weakness, so it was time for another replacement. This time, I was able to find a 1.5 TB disk that does not require an external power supply. With this disk size I will be able to keep two weeks of backups and fully automate the removal of old backups (currently a manual process).

But this disk is so new that it is not really well supported by Linux. The first step was to update the /usr/share/smartmontools/drivedb.h file with the following entry, so smartmontools can recognize it:

{ "USB: Seagate FreeAgent Go Flex USB 3.0; ",
"0x0bc2:0x5031",
"",
"",
"-d sat"
},

Unfortunately hdparm does not seem to recognize it either, but I found a way to test the standby status by using smartctl instead (I need this to be sure that the disk is really in standby mode before opening the safe – see my previous posts on this subject for the reason):

$ sudo smartctl -n standby /dev/sde
smartctl 5.41 2011-03-16 r3296 [x86_64-unknown-linux-gnu-2.6.38-2-amd64] (local build)
Copyright (C) 2002-11 by Bruce Allen, http://smartmontools.sourceforge.net

Device is in STANDBY mode, exit(2)
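To avoid misreading that output, the check can be wrapped in a small script. A sketch – the device name is an assumption, and the exit-code interpretation follows the smartctl convention visible above: status 2 means the check was skipped because the drive is in standby:

```shell
#!/bin/sh
# Sketch: report whether the backup disk is spun down before opening
# the safe. /dev/sde is an assumption; adjust to your setup.
disk=${1:-/dev/sde}

standby_status() {
  # Interpret the exit code of "smartctl -n standby":
  # 2 = check skipped, drive is in STANDBY (what we want to see);
  # 0 = drive is active and was checked.
  case "$1" in
    2) echo "standby" ;;
    0) echo "active" ;;
    *) echo "error" ;;
  esac
}

# Only probe when smartctl is actually installed.
if command -v smartctl >/dev/null 2>&1; then
  sudo smartctl -n standby "$disk" >/dev/null 2>&1
  echo "disk is $(standby_status $?)"
fi
```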

It seems that this disk is a little slower than the previous one – a full backup takes ~9 hours now, whereas with the previous disk it took a little more than 7 hours. The new disk supports USB 3.0; it's a shame I cannot use it.

Last minor issue: there is a bug in the kernel, so each time the disk is used a spurious log entry appears in syslog:

[171412.410641] sd 16:0:0:0: [sde] Sense Key : Recovered Error [current] [descriptor]
[171412.418494] Descriptor sense data with sense descriptors (in hex):
[171412.425309] 72 01 04 1d 00 00 00 0e 09 0c 00 00 00 00 00 00
[171412.432089] 00 00 00 00 40 50
[171412.436059] sd 16:0:0:0: [sde] ASC=0x4 ASCQ=0x1d

It seems that there is a patch available for this problem, but I do not see it applied in the latest kernel yet.

Update 05/22/2011:

A full backup with USB 3.0 takes only 13% less time than with USB 2.0 – so more or less the same speed I had before changing the disk. It also seems that the USB 3.0 driver is not complete, as I had to force -d sat,12 to use smartctl, and the disk never goes into standby mode. And as expected, Linux 2.6.39 does not fix the sense data bug.

Update 11/19/2011:

Problem finally fixed in Linux 3.1.

Update 07/15/2013:

Fixed link to previous post on this subject.

Fireproof backup: an update

One year ago I posted a blog entry about my fireproof backup. The backup is still in place today, and has prevented the loss of multiple days of work, but there were some occasional problems at the beginning that might be interesting to share.

The first disk I used died after something like 3 months, probably because I opened the safe during a backup (the disk is located in the safe's door). But one could have expected more from a disk supposed to be “ultra portable”. Anyway, I bought a second disk, but it had bad sectors which created problems for the first few weeks, until I reformatted it with “mkfs.ext3 -c”. I still remove the old backup files manually each week before the full backup, and for good measure I also run a “fsck -f” before removing the files on the disk (even if it takes more time, I learned the hard way that the fsck should be done *before* removing the files, in case fsck decides that the most recent files are corrupted…)
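That weekly routine can be written down as a short script. A sketch – the device name, mount point and backup file layout are assumptions, and the commands are only printed until DRY_RUN is emptied:

```shell
#!/bin/sh
# Sketch of the weekly cleanup: fsck *before* removing the old backups,
# so a corrupted recent file is caught while the previous set still
# exists. /dev/sdb1, /mnt/backup and the "old" directory are assumptions.
: "${DRY_RUN:=1}"
run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }

run umount /mnt/backup
run fsck -f /dev/sdb1
run mount /dev/sdb1 /mnt/backup
run rm -rf /mnt/backup/old
```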

As opening the safe door during a backup is not a good idea, I tried to find a way to have a visual clue on the safe itself that the disk is currently in use, but without success. Something as simple as a male/female dongle that could be inserted in the USB cable, with an LED that lights up when the current drawn on the +5V line is above some threshold, would be perfect. Another project to put on the stack…

I also would like to use a larger disk – 1.5 TB would be perfect, as I could keep two full backups, plus two weeks of incremental backups. Currently I have to be careful and, for example, synchronize and rebuild the complete Android tree only on the day before the full backup. Unfortunately I have not yet been able to find a disk of this capacity that is powered by the USB connector (but removing all the ogg files from my desktop computer, as explained in the previous blog entry, helped keep this under control for now).

I also tried to convert my setup to USB 3.0 to accelerate the backup (it currently takes between 8 and 15 hours) – going as far as installing a USB 3.0 card in my desktop – until I realized that I cannot change the USB cable that goes inside the safe without compromising the waterproof seal.

And talking about the waterproof seal, one of the problems of the safe itself is that humidity quickly builds up inside (probably exacerbated by the heat of the disk), so after some time the documents stored there feel kind of damp. I bought a dehumidifier that solved the problem – I just have to plug it in overnight every three months or so to renew the crystals.

Fireproof backups

After the last three earthquakes in the Bay Area, I started thinking that developing code without a backup, in a house made entirely of wood, was probably not the best idea in the world. I have worked from home for a long time, but until last year I always committed my code to my employer's central repository, so it was not really an issue. I use RAID10 on my development computer, so losing a disk is not a problem, but I would still lose all the code I wrote during the last year if the house burned down.

I basically never delete anything – when my disks are full I just buy bigger disks and copy everything onto them. The consequence is that I currently have 572 GB to back up, which excludes remote backup – I used rsync.net in the past and they have a terrific service, but uploading 300 GB of compressed data is not an option here – and I do not want to save only a subset of the disks.

Doing a backup on an external disk was the only solution remaining, but the problem is that if the house burns, the backup disk burns with it. I could have rented a safe deposit box at my bank, but having to exchange the disks each week was too much of a burden. The solution I chose was to put the backup disk in a fireproof safe installed in the house. The next problem is that the house can burn while I am doing the backup – which is done each night, so that's more or less eight hours each day with an unprotected backup. One neat solution I found was a disk directly encased in a fireproof safe; the big advantage is that the disk is protected even during the backup. But I also wanted to be able to store my passport and other documents, so I found the perfect solution: a USB fireproof safe. I can now put a 1TB disk inside the safe, and use it without having to open and close the safe.

The next step was the backup itself. I wanted to use good old dump/restore, but with only one partition containing everything I was not able to unmount it for the backup, and doing a backup on a live partition – even during the night – is not a good idea. And anyway, with a full backup taking ten hours to complete, that was not an option. The best solution I found was to use an LVM snapshot – you just create a snapshot of the partition to back up, then you back up the snapshot and can continue to use the main partition. Unfortunately I did not have LVM installed, so I had to copy the whole partition to an external disk, change the partitions to support LVM, and then copy the whole external disk back – it took twenty-four hours, but it was worth it.
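The snapshot cycle can be sketched as follows – the volume group and logical volume names (vg0/home), the snapshot size and the dump file path are all assumptions, and the commands are only printed until DRY_RUN is emptied:

```shell
#!/bin/sh
# Sketch: back up a frozen LVM snapshot while the live volume stays in use.
: "${DRY_RUN:=1}"
run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }

# The snapshot only needs room for the blocks changed during the backup.
run lvcreate --snapshot --size 10G --name home_snap /dev/vg0/home
# Level-0 dump of the snapshot; -u records the date in /etc/dumpdates.
run dump -0u -f /mnt/backup/full-0.dump /dev/vg0/home_snap
run lvremove -f /dev/vg0/home_snap
```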

The last step was to install dump and configure it to do a full backup each Sunday and an incremental backup every other night of the week. I used some of the scripts delivered in the dump package, with some modifications to adjust them to my needs.
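The schedule can be expressed as a crontab fragment – the script names here are hypothetical stand-ins for the (modified) scripts from the dump package:

```
# /etc/cron.d/backup – sketch; script names are assumptions.
# Full backup Sunday at 1am, incremental backup the other nights.
0 1 * * 0   root /usr/local/sbin/backup-full
0 1 * * 1-6 root /usr/local/sbin/backup-incremental
```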

Now I receive an email each morning with the result of the backup. The last thing remaining to do will be to try a full restore.

Updated 01/10/2011