bash wrap for dd | split | gzip ?


bash wrap for dd | split | gzip ?

Postby silly_xp_user » Sat Nov 04, 2006 3:52 pm


Can anyone please help...

I have 2 ways of backing up a drive (with the output split) to a drive on a network:-

Code:
Method A)
dd bs=16M if=/dev/hda | gzip -c | split -b 256m - /mnt/nwdrive/bak.img.gz.

gives portions of a gzipped hard drive in /mnt/nwdrive/bak.img.gz.*

Code:
Method B)
dd bs=16M if=/dev/hda | split -b 256m - /mnt/nwdrive/bak.img.

gzip /mnt/nwdrive/bak.img.*

gives gzipped portions of a hard drive in bak.img.*.gz

However, all the intermediate bak.img.* files are written to disk first, so I need enough space to hold them all. I don't have that space locally or on the network drive.

Method A) is the more usual way, it gzips the entire drive and then splits the output.
Method B) splits the dd of the entire drive then gzips those splits.

I prefer Method B) but I am limited on space and bandwidth.

Is there any obvious way of achieving the same thing as B) without using all that space, by varying parameters to dd or doing some bash scripting? Somehow I need to create and gzip one split at a time.
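One way to sketch the "create and gzip one split at a time" idea with just dd, gzip and the shell is to loop over the source in fixed-size chunks, compressing each chunk as it is read, so nothing uncompressed ever lands on disk. The demo below runs on a scratch file; for the real job the input would be /dev/hda, the chunk would be bs=16M count=16 (256 MB), and the output prefix /mnt/nwdrive/bak.img. (those substitutions are assumptions, not tested against a real disk).

```shell
#!/bin/sh
# Sketch: back up a source one chunk at a time, gzipping each chunk in a
# single pipeline. Demonstrated on a scratch file standing in for /dev/hda.
workdir=$(mktemp -d)
head -c 50000 /dev/zero > "$workdir/disk.img"   # stand-in for the disk

i=0
while :; do
    suffix=$(printf '%03d' "$i")
    out=$workdir/bak.img.$suffix.gz
    # chunk i (here bs=8k count=2 = 16 kB; for a real disk use bs=16M count=16)
    dd if="$workdir/disk.img" bs=8k count=2 skip=$((i * 2)) 2>/dev/null \
        | gzip -c > "$out"
    # past the end of the input, dd emits nothing and gzip -l reports an
    # uncompressed size of 0: that is the stop condition
    if [ "$(gzip -l "$out" | awk 'NR==2 {print $2}')" -eq 0 ]; then
        rm -f "$out"
        break
    fi
    i=$((i + 1))
done

# restore: decompress the pieces in order and concatenate
gzip -dc "$workdir"/bak.img.*.gz | cmp - "$workdir/disk.img" && echo OK
```

Only one chunk is in flight at any moment, so peak local space is one compressed chunk rather than the whole image. Note that gzip -l reads the stored size modulo 4 GB, which is fine for 256 MB chunks.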

If at all possible I would prefer to stick to these tools plus the shell, I am not interested in G4U, ghost, etc.

Thanks for any constructive help given.

Posts: 11
Joined: Sat Oct 22, 2005 3:55 pm

RE: bash wrap for dd | split | gzip ?

Postby MartyBartfast » Sat Nov 04, 2006 4:59 pm

Never used dd, so can't really comment, but I can offer another option:

/sbin/dump -0 -z9 -M -B 256000 -f bak.dump /usr

will give you a compressed multi-volume dump of /usr in bak.dump001, bak.dump002, etc., with each portion being 256 MB.
You then need restore to recover files from the dumps.
I have been touched by his noodly appendage.
LXF regular
Posts: 842
Joined: Mon Aug 22, 2005 7:25 am
Location: Hants, UK

RE: bash wrap for dd | split | gzip ?

Postby nelz » Sat Nov 04, 2006 6:23 pm

I don't think dd is the right tool for this job, because you are copying every bit on the disk, even the no-longer-used space. I'd use partimage for this: it works well over a network, only backs up the parts of the disk that are in use, and is capable of splitting and gzipping the files.

However, if you do want to use dd, you aren't quite correct in stating that A "gzips the entire drive and then splits the output". The use of pipes means that dd, gzip and split are operating simultaneously. One of the advantages of this is that the data is compressed before transfer over the network, which is usually faster, unless you have a very fast network and are backing up a slow machine. In that case you could use B with a file notification program running on the destination machine, something like fileschanged, which will detect each new backup file and can be told to gzip it.
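For completeness: GNU coreutils versions later than this thread (split from coreutils 8.4 onwards) grew a --filter option that pipes each chunk through a command as it is cut, which does method B with no intermediate files at all. A minimal sketch on a scratch file, assuming a recent coreutils; for the real job the input would be the dd pipeline and the prefix /mnt/nwdrive/bak.img.:

```shell
#!/bin/sh
# Sketch: split compresses each chunk as it cuts it; $FILE is set by split
# to the chunk's name (bak.img.aa, bak.img.ab, ...). Requires GNU split
# with --filter support (coreutils 8.4+).
workdir=$(mktemp -d)
head -c 100000 /dev/zero > "$workdir/disk.img"   # stand-in for dd's output

# note the single quotes: $FILE must be expanded by split, not this shell
split -b 30000 --filter='gzip > $FILE.gz' "$workdir/disk.img" "$workdir/bak.img."

# restore: decompress the pieces in shell glob (i.e. split suffix) order
gzip -dc "$workdir"/bak.img.*.gz | cmp - "$workdir/disk.img" && echo OK
```

Each chunk exists only as a stream between split and gzip, so the destination ever holds compressed pieces, never a raw split.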
"Insanity: doing the same thing over and over again and expecting different results." (Albert Einstein)
Site admin
Posts: 9046
Joined: Mon Apr 04, 2005 11:52 am
Location: Warrington, UK

Postby wdlerner » Wed Jan 03, 2007 2:42 am

Your block size seems a bit large. Can you explain why you are using a 16 Megabyte block size?

Posts: 2
Joined: Wed Jan 03, 2007 2:24 am
Location: Virginia, USA
