Making Of A Backup Script
Robert83
Hello,

I'm writing a backup script here and have run into a few small problems.

My script looks like this (roughly... the real one is longer):

CODE
#!/bin/bash
PATH=/bin:/sbin:/usr/bin:/usr/sbin

# run this script at lower priority ($$ is the PID of this shell)
renice -10 -p $$

# -p: don't fail if the directories already exist
mkdir -p /share3/backup
mkdir -p /share3/n-tibor

# mount the Windows share as guest, copy only new/changed files, then clean up
mount -t smbfs -o guest "//n-tibor/My Documents" /share3/n-tibor
cp -au /share3/n-tibor /share3/backup
umount /share3/n-tibor
rmdir /share3/n-tibor

This works OK: it copies everything from that share to the local backup folder, and on later runs it only copies the files that are missing or have changed.

The problem is this, and I can't seem to solve it for some reason:

when I did tar -jcf /share3/backupDVD/backup-gepek.tar.bzip /share3/backup the result was OK in size
when I did tar -cZf /share3/backupDVD/backup-gepek.tgz /share3/backup the result was too big

What I would like to do is archive those files in such a way that they are compressed and can be uncompressed on a Windows-based computer if necessary...

Also, how do you use that -g or -G switch?

I would like tar to only update the changed parts of the archive instead of recreating the whole thing again and again.
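From the man page it looks like -g (--listed-incremental) keeps a snapshot file and then only stores files that changed since the last run, so I guess it would go something like this (just my guess, the .snar file name is made up by me, I haven't tested it):

CODE
# first run: full backup, tar records file metadata in the snapshot file
tar -c -z -g /share3/backupDVD/backup.snar -f /share3/backupDVD/backup-full.tar.gz /share3/backup

# later runs with the same snapshot file only store new/changed files
tar -c -z -g /share3/backupDVD/backup.snar -f /share3/backupDVD/backup-incr1.tar.gz /share3/backup

If I understand it right, each run makes a new, smaller archive rather than modifying the old one, but I'm not sure that's the correct way to use it.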

Thank you

Sincerely
Robert B
chrisw
-j, --bzip2 filters the archive through bzip2
-Z, --compress, --uncompress filters the archive through compress

I believe the -j option (bzip2) compresses the data more than 'compress' does.

How I usually create tar files is to run them through gzip compression using the -z option.

Or you can just tar the files using the -cvf flags and then run the resulting file through the gzip program with either the --fast or --best flag.
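For example, something like this (rough sketch, adjust the paths to yours):

CODE
# compress while creating the archive
tar -czvf /share3/backupDVD/backup-gepek.tar.gz /share3/backup

# or create the tar first, then compress it separately
tar -cvf /share3/backupDVD/backup-gepek.tar /share3/backup
gzip --best /share3/backupDVD/backup-gepek.tar

Either way you end up with a .tar.gz (.tgz) file, which most Windows archivers can open.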