DHSnapshot

A while ago I needed to find a good setup for backing up some machines.

I wanted to use rsync to do snapshot-like backups.

I wanted a setup that is:
* Off-site
* Storage efficient (the bulk of my backup was unlikely to ever change)
* Network efficient (only changed files are transferred)
* Able to keep versions going back a few months

Now, DreamHost offers 50GB of free space for personal backups, and that seemed like a good fit for this particular situation.

Then I found rsnapshot, and it was exactly what I needed.
Unfortunately, rsnapshot doesn't back up to an external server.
It's meant for a setup where the server hosting the backups connects to your data sources and pulls the files.
That makes perfect sense and is probably the better arrangement, but it wouldn't work for me.

DreamHost doesn't have rsnapshot installed on their backup servers, and the only access users have to that machine is SFTP and rsync.
I needed something that would work with just that.

I came across a post about backing up DreamHost websites to DreamHost Backups that showed a workaround to the lack of SSH access to the backups service.

So I took the idea, mixed it with the things I wanted from rsnapshot, and wrote a small Perl script to do it.

It's called dhsnapshot and is published on GitHub: http://github.com/carloslima/dhsnapshot

At the moment it's not very flexible: it's hardwired to keep 7 daily, 4 weekly, and 6 monthly backups, and it's limited to a single backup source (you can only point it at one source directory).
But it wouldn't be hard to change that, or even to make it configurable.
I might do it if I ever get the need or the motivation :)
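
For the curious, a fixed 7/4/6 retention like that can be pictured as plain directory moves: once a week the oldest daily snapshot is promoted to weekly.0, once a month the oldest weekly to monthly.0, and the oldest monthly is dropped. A sketch of that promotion step (generic rsnapshot-style rotation, not dhsnapshot's actual code; names are illustrative):

```shell
# promote_oldest ROOT FROM TO KEEP: move ROOT/FROM to ROOT/TO.0, after
# shifting TO.0..TO.(KEEP-2) up by one and dropping TO.(KEEP-1).
promote_oldest() {
    root="$1"; from="$2"; to="$3"; keep="$4"
    i=$((keep - 1))
    rm -rf "$root/$to.$i"            # drop the oldest snapshot in this tier
    while [ "$i" -gt 0 ]; do
        prev=$((i - 1))
        [ -d "$root/$to.$prev" ] && mv "$root/$to.$prev" "$root/$to.$i"
        i=$prev
    done
    [ -d "$root/$from" ] && mv "$root/$from" "$root/$to.0"
}
```

Calling promote_oldest backups daily.6 weekly 4 once a week and promote_oldest backups weekly.3 monthly 6 once a month keeps exactly the 7-daily/4-weekly/6-monthly window described above.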

One way or the other, it should be reasonably simple to set up (instructions on GitHub).

I'd be glad to know if it helped anyone.
So, please, drop me a comment if you find it useful.

7 Comments:

  1. Marcus Friedman | ellipsys... said...
    Carlos, I'd like to thank you for sharing your wonderful script with the community.

    I'm quite impressed by the simplicity of your code, which I think fits perfectly within one of the principles of the Unix philosophy: "Small is beautiful". It's really nice to see how cleverly you've leveraged the available tools in order to accomplish your goal.

    Congratulations, and keep up the good work!

    Best regards,
    Marcus

  2. Kirkland said...
    Hi Carlos,

    This works great. I might be forced to learn enough perl to be able to backup several directories at once! :)

    Rob

  3. Carlos Lima said...
    I'm surprised this post was read and happy that it was helpful!
    I haven't checked here in a long while and wasn't expecting any comment.
    Thanks for passing by and for dropping me a line.

    @Marcus: I'm happy it was helpful and happier for reading the feedback. Thanks for the kind words.

    @Kirkland/Rob: if you do, would you consider sharing it back, either as a fork on GitHub or somewhere else? I'd be interested in it :)

  4. Chad said...
    Carlos, this script is genius! I was in the middle of setting up a backup system to back up my Ubuntu home folder. I'd gotten rsync set up and had set up my keypair, and was just about to make a cron job, when I found a link to this in one of the DH forums, and this is much better than what I was going to have. The snapshots feature is brilliant, and the whole script is simple and just works. Your instructions were very easy to follow as well. Thanks!

  5. Anonymous said...
    Great script, thanks. One suggestion: in your instructions you set the permissions of the .conf file to 400. Well, that's not enough, and Perl will complain:

    Can't locate dhsnapshot.conf in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.10.0 /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at /home/xxx/backup/dhsnapshot.pl line 9.

    I have it set to 755 and it works now.

  6. Suman said...
    Thanks a bunch Carlos. This works perfectly! :)

  7. Júda Ronén said...
    Thank you so much for this script. It is so elegant…
