The Data Backup Thread (& request for more suggestions)

Another recommendation for CrashPlan+. In addition to it being a great product, I've found they also have good customer service, which is really important to me since I'm not the tech genius many of you are.

Right now I've got the individual plan to back up my MBP, but I'm considering switching to the family plan so that I can also back up my gaming PC.

ibdoomed wrote:

For SOHO use, I'd suggest the DS411j for under $400 and the biggest disks you can afford.

http://www.amazon.com/Synology-DiskS...

These 2tb are on the compatibility list and only ~$100 each:
www.amazon.com/Seagate-ST2000DM001-B...

You could start with 2 and have 4tb raw for about $600 and eventually get to 16tb raw if you need that much.

I followed this suggestion and the setup couldn't have been easier. The system has a nice web based interface that was easy to use and even comes with an interface ready for display on a mobile device.

By default it went into a RAID setup for the two drives, so I'd have to change that to get the full 4 TB, but for now that's going to work for me.

Thanks for the suggestion.

I checked to see if there was another thread that matched what I want, but this one seemed close enough.
On to the question!

I need to back up several websites, preferably by FTP to a zipped archive. I see tons of options out there, but I wanted to see what others are using. I mainly work with WordPress, and I know there are backup plugins, but I'd prefer to back up the root directory and everything in it.

Thoughts?

You may be able to use rsync or a derivative. I use (for some definition of "use") rdiff-backup as detailed in the October 2010 issue of Linux Journal. I don't think it's as fiddly as the article suggests. The writing could've been a bit clearer. It does not itself zip and FTP, but perhaps incremental backup over rsync would suffice.
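Just to give a flavor of it, here's a minimal sketch (the host and paths are made up; the remote end needs rdiff-backup installed and SSH access):

# pull an incremental backup of the remote web root into a local mirror
rdiff-backup user@webhost.example.com::/var/www/mysite /backups/mysite

# later, restore the tree as it looked three days ago into a scratch directory
rdiff-backup -r 3D /backups/mysite /tmp/mysite-restore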

I have only backed up a couple of times, and have not restored from such a backup, so this is tantamount to having not backed up at all. However, it seems like it might work for you.

groan wrote:

I checked to see if there was another thread that matched what I want, but this one seemed close enough.
On to the question!

I need to back up several websites, preferably by FTP to a zipped archive. I see tons of options out there, but I wanted to see what others are using. I mainly work with WordPress, and I know there are backup plugins, but I'd prefer to back up the root directory and everything in it.

Thoughts?

For simplicity's sake and a bit of added security, I tend to use rsync over ssh as the transfer. If I needed to zip up the results, I'd add a command that runs after the rsync completes.
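Something along these lines, roughly (the host and paths are placeholders, not anything specific to your setup):

# pull the site down over ssh, then roll the copy into a dated archive
rsync -az -e ssh user@webhost.example.com:/home/user/public_html/ /backups/mysite/
tar czf /backups/mysite-$(date +%F).tar.gz -C /backups mysite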

Oh, also: I'm a systems newb. I can work my way around a command line as long as there are instructions.

A GUI is best for me

I'll look at your link, though; ideally a push-button or scheduled backup utility is what I'm looking for. I'm asking here to see if anyone has experience with any.

groan wrote:

Oh, also: I'm a systems newb. I can work my way around a command line as long as there are instructions.

A GUI is best for me

I'll look at your link, though; ideally a push-button or scheduled backup utility is what I'm looking for. I'm asking here to see if anyone has experience with any.

How about grsync, then?

Oh, one note: the "--delete" option is dangerous. You should leave it out of your script until you're sure it's really working. It means to delete everything in the target that's not in the source, so if you somehow get your paths confused, you can wipe out everything in your target dir. Make sure your syncs are copying files exactly where you want them before adding --delete.

If you're backing up just one machine, one directory, and don't want to bother with ssh keys, you could use this script, if you're on a Linux machine locally:

#!/bin/bash
mkdir -p machinename
/usr/bin/rsync -aHAX --delete user@machinename.example.com:/backupsource/ ./machinename/backupsource

i.e., just three lines: one to say that it's a bash script, one to silently create a target directory (saying nothing if one already exists), and one to synchronize the files.

If you want to get more complex, this is a censored version of the script I use to back up the critical bits of my various Linux boxes. I'm backing up /etc and /home; you'll probably want to change those to something else.

#!/bin/bash
eval $(ssh-agent)
ssh-add
for i in $(cat list_of_machines); do
    mkdir -p $i
    /usr/bin/rsync -aHAX --delete root@$i.example.com:/etc/ ./$i/etc
    /usr/bin/rsync -aHAX --delete root@$i.example.com:/home/ ./$i/home
done
kill $SSH_AGENT_PID

So, this script depends on another file, called "list_of_machines", which should have the hostnames of the machines to back up, one per line. So, say you have machine1.example.com, machine2.example.com, and machine3.example.com, the file would look like this:

machine1
machine2
machine3

So it loops through that file, reading the name off each line, and then runs an rsync to that machine, copying its directories locally. I have two rsync commands in the script, one for "/etc" and one for "/home" (which is enough for me to recreate the machines from a skeleton install; everything that differentiates a machine from a vanilla install is always in those two directories).

The 'ssh-agent', 'ssh-add' and 'kill' lines load a key into the agent; this requires that I be there when the script runs to type in the passphrase. The upside is that I don't have to type a password for each machine: however many there are, they all reuse the key, and I only type the passphrase once. (The downside is that I can't run it automatically; I have to be there to do the typing.) Then it kills the agent again, so it's not sitting there holding my key. You do need public-key logins enabled on your servers for this to work.
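If you haven't set up key logins before, it's usually just a matter of generating a keypair and copying the public half to each server (the hostname here is just an example):

ssh-keygen -t rsa                         # generate a keypair; give it a passphrase when prompted
ssh-copy-id root@machine1.example.com     # append your public key to that server's authorized_keys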

Somewhere in between those two extremes, the dedicated script that backs up one specific machine to one specific place and the more general one that backs up multiple machines using ssh-agent to cut down on typing, you should be able to make yourself a script that covers what you need.

groan wrote:

I checked to see if there was another thread that matched what I want, but this one seemed close enough.
On to the question!

I need to back up several websites, preferably by FTP to a zipped archive. I see tons of options out there, but I wanted to see what others are using. I mainly work with WordPress, and I know there are backup plugins, but I'd prefer to back up the root directory and everything in it.

Thoughts?

Since you specifically mentioned backing up WordPress, you might want to try out UpdraftPlus. It's a WordPress plugin that backs up your themes, plugins, uploads and DB to FTP, Dropbox, Google Drive or S3. For added security, you can encrypt the DB backup. It doesn't back up the WordPress core files, which could be a problem if you have hacked the core files.

I use it on my WordPress install and it works really well - also, the developer is very responsive to support queries in the WP.org forums, which is a big plus in my book.

Thanks all,
I should note that these are hosted; I do have SSH access, but I don't have root.

avggeek, I'll check out that plugin. If I need to, I'll use the plugin to back up to S3 and just copy out the other stuff outside of the WP files.

Ideally I'd like something that backs up the /PublicHTML folder, but I'll get over it.

I appreciate all the command-line stuff, but part of my problem is that I lack the discipline to run backups by hand; I need them to be scheduled. I lost 5 or more sites last month due to my lack of preparedness, and I'm trying to make amends.

groan wrote:

I need to back up several websites, preferably by FTP to a zipped archive. I see tons of options out there, but I wanted to see what others are using. I mainly work with WordPress, and I know there are backup plugins, but I'd prefer to back up the root directory and everything in it.

Thoughts?

First, it's much more important that you back up the database, which you won't get by zipping up the WordPress root directory. So, before worrying about that step, make sure you've got a solution for taking database snapshots.
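If you have shell access, the simplest snapshot is just mysqldump pointed at the credentials from wp-config.php, something like this (the user, password, and database names are placeholders):

# dump the WordPress database to a dated, compressed file
mysqldump -u wp_user -p'wp_password' wp_database | gzip > wp-db-$(date +%F).sql.gz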

For the few WP blogs we still maintain at work, we're using BackWPup, which has the ability to run scheduled backups of the whole site file tree as well as the database. There are actually a number of different things you can have backed up. You can schedule backups and have it upload via S3, FTP, etc.

Anyone have good advice on backup software for use over a wireless network? I've been trying to back up the C: drive with SyncBack Free, but it keeps failing at the end of the initial 12-hour run. If it can't get everything written over in that time, I'll have to find another free program or something reasonably priced; Windows 7 Ultimate is a bit pricey for me. If anyone knows of programs that handle wireless data transfer well and only update files with changes, please let me know.

Where are you backing up to? If it's local, do you have the option of just copying over the files for the first run?

Scratched wrote:

Where are you backing up to? If it's local, do you have the option of just copying over the files for the first run?

This. I would try using robocopy/SyncToy to back up your data off the C: drive; you can always reinstall the OS. Depending on installed software, it could be quicker to do a full HD backup. If it's a small enough office, I would go with Windows Home Server: it will back up all the drives on all the workstations (Windows XP through Windows 8), and you can do file-level restores and bare-metal recovery (make sure you have the NIC drivers somewhere). WHS will only do 10 machines, though.

Rahmen wrote:
ibdoomed wrote:

For SOHO use, I'd suggest the DS411j for under $400 and the biggest disks you can afford.

http://www.amazon.com/Synology-DiskS...

These 2tb are on the compatibility list and only ~$100 each:
www.amazon.com/Seagate-ST2000DM001-B...

You could start with 2 and have 4tb raw for about $600 and eventually get to 16tb raw if you need that much.

I followed this suggestion and the setup couldn't have been easier. The system has a nice web based interface that was easy to use and even comes with an interface ready for display on a mobile device.

By default it went into a RAID setup for the two drives, so I'd have to change that to get the full 4 TB, but for now that's going to work for me.

Thanks for the suggestion.

The linked-to Synology is $1,200, but there are other options. This smaller Synology seems to receive high ratings, and it gets a mention in another, slightly older thread.

Thoughts? Will this tend to be less expensive and more reliable/useful than getting a mini-ITX box, minimum PSU and mainboard, fan, and a small system hard drive and running that headless?

EDIT: Maybe a Raspberry Pi and a big USB drive would work?

For Presidents' Day, Newegg has a Thecus N5550 diskless NAS bundled with two Western Digital Red WD20EFRX 2 TB drives for $589. Not sure if that's a great deal, though I did just drop $450 for 3 more 3 TB drives.

That is a bit of storage.

One of the nice things about the units like the Synology is that they're really just Linux boxes, but with the complexity mostly hidden under a reasonable web interface. The advantage is that it's largely an appliance at that point; the disadvantage is that it's an appliance, and it's harder to bend to your will. You can still modify one pretty heavily, but mostly by using packages that people have created, rather than by just using it like a normal Linux machine.

If you've got a reasonable amount of hardware and Linux knowledge, or if you're interested in taking the time to develop them, I tend to think that a DIY box will be cheaper, faster, and much more flexible. But if you're not already at least a junior-grade admin, and you're not interested in learning to be one, then the Synology boxes are a good way to take advantage of many of the strengths of open source without destroying your pocketbook. It's a minimal-thought, minimal-mistakes solution, where doing it yourself can be either a ton of fun or an ordeal, depending largely on whether you enjoy tinkering.

Oh, and a Pi driving a USB hard drive would be a reasonable poor-man's NAS, but it probably won't be able to saturate a Fast Ethernet (100Mbit) connection, and won't even come close to filling a gigabit. If all you're doing is some pictures and music streaming, that would be fast enough. But, as an additional problem, backing up USB hard drives really sucks horribly. It takes bloody forever.

I think you'd be better off going a little further upscale, and getting at least a machine with two SATA connections, and two identical drives. But don't put them in a RAID; rather, build them as separate filesystems, and back one up to the other on a daily or weekly basis.
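As a rough sketch of what that looks like, a single cron entry can handle it (the mount points /data and /backup are just examples):

# crontab -e: mirror the primary drive to the second one every Sunday at 3am
0 3 * * 0 rsync -aHAX --delete /data/ /backup/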

RAID is to protect against downtime from drive failure; it does not substitute for a backup. Backups are much more important than saving yourself downtime, especially at home.

Malor wrote:

One of the nice things about the units like the Synology is that they're really just Linux boxes, but with the complexity mostly hidden under a reasonable web interface. The advantage is that it's largely an appliance at that point; the disadvantage is that it's an appliance, and it's harder to bend to your will. You can still modify one pretty heavily, but mostly by using packages that people have created, rather than by just using it like a normal Linux machine.

If you've got a reasonable amount of hardware and Linux knowledge, or if you're interested in taking the time to develop them, I tend to think that a DIY box will be cheaper, faster, and much more flexible. But if you're not already at least a junior-grade admin, and you're not interested in learning to be one, then the Synology boxes are a good way to take advantage of many of the strengths of open source without destroying your pocketbook. It's a minimal-thought, minimal-mistakes solution, where doing it yourself can be either a ton of fun or an ordeal, depending largely on whether you enjoy tinkering.

I'm pretty comfortable running Linux; I'm six or seven years into Slackware. I'm competent with most of the basics and prefer the command line, so I think rolling my own would allow much more flexibility. However, pricing out a mini-ITX setup gets pretty close to, if not exceeds, the price of the Synology. A case for, say, $40, a mobo for $60-80, a system drive for $50, a PSU for $30, and RAM for $30 puts me over $200. I'd love to come in under that, but I'm starting to think that's irrational. Grabbing an old system off CL sounds good until you have to sort out what kind of networking it's got on-board; maybe it's IDE, maybe it could use more RAM but it's using old, now-more-expensive RAM, etc. I'm thinking ~$200 is a good budget to start with.

Thanks for the notes.

I've been running Synology since it was recommended to me in this thread months ago. So far it's been great.

I finally got my Synology NAS set up so that my desktop, laptop, and Linux virtual dev server are all backing up to it via CrashPlan. I haven't signed up for CrashPlan+ yet, but I will soon. The only thing I'm backing up without CrashPlan is my music; that's just an rsync so the two copies mirror each other, because I don't care about tracking changes.

Here's my question, though. I've seen in this thread that your backups should be set so that if you have to do a restore, your system is back to where it was as quickly as possible. Does that include programs? Because as long as my data is preserved, reinstalling software as I need it isn't a big deal to me. And not a bad way to clean house a bit, anyway.

"Restore as quickly as possible' is advice for system administrators, who need to get people productive again after disasters; every minute an employee is without a computer costs the company money.

For your own use, you make those decisions yourself. If you don't care about a speedy restore, then you don't care. As long as you're backing up what you think is important, and you have an idea of how you'll get back to something approaching normal, then you're fine. Backups are important, probably critically important, but their contents can only be determined by you.

Thanks! Good to know. I've got an SSD coming in the mail. Figured this'd be a good chance to test getting it up and running with data restored to it, seeing as my current system drive is just fine; if I missed something important in the backup, I still have the healthy drive to pull it from.

Something to note if you've got a Crucial M4 SSD: they've just put out a new firmware update, 070H, for which they recommend doing a backup before flashing: http://www.crucial.com/uk/support/fi...

Resolved a power-up timing issue that could result in a drive hang, resulting in an inability to communicate with the host computer. The hang condition would typically occur during power-up or resume from Sleep or Hibernate. Most often, a new power cycle will clear the condition and allow normal operations to continue. The failure mode has only been observed in factory test. The failure mode is believed to have been contained to the factory. This fix is being implemented for all new builds, for all form factors, as a precautionary measure. The fix may be implemented in the field, as desired, to prevent occurrence of this boot-time failure. To date, no known field returns have been shown to be related to this issue. A failure of this type would typically be recoverable by a system reset.

That's a smart way to do it -- whenever playing with extreme system makeovers, always have a giant safety net if possible.

I'd suggest disconnecting the old drive completely during installation. It's fairly easy to get confused and erase the wrong drive. Connect ONLY the new drive during your OS reinstall and data restore; if you're missing something, then and only then reconnect the old drive, and only briefly, just long enough to get your data.

Once you're really sure that your restore is complete, and that you're fully functional, then you can reattach the old drive, and use it for data storage.

Using this method will also help show you whether or not your current backup system and recovery plan actually works. Testing your restore process is a very good idea, so this is giving you both a nice shiny new SSD, and some restore practice.

Nevermind.

Random question. I'm looking at Windows Backup, just investigating it really rather than anything serious. It wants to back up my System Reserved, C: (Windows partition), and D: (random programs) drives. It seems to think D: is essential, but I can't think of a reason why it needs it (Visual Studio?). How do I find out why?

I'm looking for a Synology NAS and SOS online backup solution.

I currently have a subscription to SOS that has been backing up folders on my PC. I recently ordered the NAS so everyone could upload their pictures and docs to it at any time. I want to move the folders off my PC (wireless) onto the NAS (wired) but still have them backed up in the "cloud". I have read that SOS only supports NAS backups if you have a business plan, which I do not.

A couple ideas that I have:

1. Map a NAS folder as a network drive.
Concerns: I'm not sure if SOS supports network drives. Also, would this mean the data path would go SOS - router - my PC - NAS? If so, that would bog down my wifi.

2. Sync select folders to my PC, then have SOS back them up.
Concerns: It would still bog down the wifi, but not as much (one-way wifi traffic). I'm also not sure of the best way or best program to do this. The built-in program for the NAS only lets you do one folder, which is kind of annoying.

Anyone have any thoughts?