Experimental compression method for folders

Here I'll show you a method to compress folders. It is not recommended for really important backups, but it might be interesting for decreasing file size when transferring files:

First you need to install lrzip, tar, xz-utils and sed:

Code:
sudo apt-get install lrzip tar xz-utils sed
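If you want to make sure everything landed on your PATH before continuing, a quick check loop like this (just a convenience sketch, not part of the method) does the job:

```shell
#!/bin/sh
# report any of the required tools that is not found on PATH
for tool in lrzip tar xz sed; do
    command -v "$tool" >/dev/null || echo "missing: $tool"
done
```

If the loop prints nothing, all four tools are installed.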

Now create a file named blxcompress.sh (or any other name you want) with the following content:

Code:
#!/bin/bash
# usage: ./blxcompress.sh folder_name
echo "Compressing folder $1"
tar cvf "$1.tar" "$1"
lrzip -n -o "$1.tlrz" "$1.tar"
xz -z "$1.tlrz"
rm "$1.tar"

Now you can compress a folder by typing ./blxcompress.sh folder_name

If you want to decompress, create the following blxdecompress.sh script:

Code:
#!/bin/bash
# usage: ./blxdecompress.sh folder_name.tlrz.xz
fname=$(echo "$1" | sed 's/........$//')   # strip the 8-character ".tlrz.xz" suffix
echo "You compressed the folder $fname"
cp "$fname.tlrz.xz" "$fname-copy.tlrz.xz"
xz -d "$fname-copy.tlrz.xz"
lrzip -d -o "$fname-copy.tar" "$fname-copy.tlrz"
tar -xvf "$fname-copy.tar"
rm "$fname-copy.tar"
rm "$fname-copy.tlrz"
echo "Folder $fname extracted successfully"

Usage: ./blxdecompress.sh folder_name.tlrz.xz
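The sed call in the script above strips the eight characters of ".tlrz.xz" from the end of the argument. If you prefer, bash parameter expansion does the same without spawning sed; here is a small sketch with a hypothetical file name:

```shell
#!/bin/bash
# same effect as: echo "$archive" | sed 's/........$//'
archive="myfolder.tlrz.xz"      # hypothetical example name
fname="${archive%.tlrz.xz}"     # remove the trailing ".tlrz.xz"
echo "$fname"                   # prints: myfolder
```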

This compression method can be better than plain xz compression. The reason is that lrzip's first stage searches for repeating patterns at distances of several hundred MB. Normal gzip and xz can't do this: gzip's window is only 32 KB, and xz's dictionary is 8 MiB at the default preset (64 MiB at -9).
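You can illustrate the window-size effect at a small scale with gzip and xz alone (both assumed installed); lrzip extends the same idea to much larger distances. The sketch below builds an 8 MiB file whose second half repeats the first: the repeat lies 4 MiB back, far beyond gzip's 32 KB window but inside xz's default 8 MiB dictionary, so xz compresses it far better:

```shell
#!/bin/sh
# make 4 MiB of random (incompressible) data, then a file containing it twice
head -c 4194304 /dev/urandom > part.bin
cat part.bin part.bin > double.bin

gzip -k double.bin    # 32 KB window: cannot see the distant copy -> ~8 MiB output
xz -k double.bin      # 8 MiB dictionary: matches the second half -> ~4 MiB output

ls -l double.bin.gz double.bin.xz
```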

lrzip is a good choice if you have files whose contents repeat very often. This method divides the process into three steps, which is more stable than letting lrzip do everything: on my machine lrzip gets really slow if I use its built-in xz method, but running xz after lrzip works very well.

I think it is ideal for compressing 1-4 GB files, provided you have time for compression and decompression: it is really slow, but sometimes reaches nearly the same ratio as normal zpaq, which is much slower still.

You might "compile" these scripts with shc and put them in /usr/bin if you like!

regards bluedxca93

Posted by: bluedxca93-web-blog.coolpage.biz on