The importance of backing up your computer cannot be overstated. Backups are an integral part of using a computer, and judging by the outages that have hit cloud services such as Microsoft Azure, it is not a good idea to rely solely on a cloud service like Google Drive or SkyDrive to store them. Keep your backups on a removable drive, with a cloud copy as well if you must; that way, if the cloud service is down, you still have your backups on a local drive. Windows 7 ships with a capable backup facility, but the Windows 8 release has neutered it for some unknown reason. The Ubuntu Linux distribution offers Ubuntu One, which connects a folder in your home directory to its cloud service: any files you copy or save into that folder are automatically uploaded. But again, make sure that any critical files are also backed up to a local drive; that is what keeps them really safe. On Linux, the rsync utility makes this easy, and a backup script run from a cron(8) job will back up your computer to an external hard drive on a regular schedule. That is a very good way to gain the peace of mind that comes with a solid backup plan. Of course, a Linux system gives you many scripting languages with which to write a useful backup script.
Perl is well suited to coding a backup script: you can pack all of the files into a compressed archive and then copy that archive to the backup drive. The example below backs up a directory to a compressed tar.gz file, which may then be copied to an external hard disk drive. The archive is time-stamped so that you know when it was created, and this simple script even checks whether the file was written properly.
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);
use File::stat;

# A script to back up a directory to a compressed archive.
my $homedir     = $ENV{'HOME'};
my $user        = $ENV{'LOGNAME'};
my $date        = strftime("%A-%d-%B-%Y-%H-%M-%S", localtime);
my $backupdrive = "/tmp";

print "Which directory do you want to back up? Type a path to continue.\n\n";
chomp(my $dir = <STDIN>);
if (!$dir) {
    print "No directory selected!\n";
    exit 1;
}

# Name the archive and build its full path in the home directory.
my $outfile = "$user-$date.tgz";
my $file    = "$homedir/$outfile";

# Create a gzip-compressed archive (-z is needed to match the .tgz name).
if (system("tar", "-czvf", $file, $dir) != 0) {
    print "The tar command failed!\n";
    exit 1;
}
print "\n\nSuccessful backup of directory: $dir.\n";

# Move the backup file to the destination.
print "Moving the archive $outfile to the backup drive.\n";
system("mv", $file, $backupdrive);

# Check whether the backup file exists at the destination.
my $filename = "$backupdrive/$outfile";
if (-e $filename) {
    print "\n\nBackup file exists. Great success!\n";
    my $size = stat($filename)->size;
    printf("The file size is: %s bytes.\n", $size);
}
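The script only checks that the archive file exists, which says nothing about whether it is readable. A further step worth knowing is to verify the archive itself; the sketch below does this with gzip -t and tar -t, using a throwaway directory and archive name as stand-ins for a real backup.

```shell
# Verify a .tgz backup beyond a simple existence check;
# the directory and archive names are throwaway examples.
SRCDIR=$(mktemp -d)
echo "important data" > "$SRCDIR/file.txt"
ARCHIVE="$SRCDIR.tgz"
tar -czf "$ARCHIVE" -C "$SRCDIR" .   # create a gzip-compressed archive
gzip -t "$ARCHIVE"                   # verify the gzip layer is intact
tar -tzf "$ARCHIVE"                  # list contents; a corrupt tar fails here
```

Both gzip -t and tar -tzf exit non-zero on a damaged file, so either check can be dropped straight into a backup script to catch a bad write before the original data is gone.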
To see how much space your home directory is taking up, use this simple one-liner.
jason@jason-Lenovo-H50-55:~$ du -shc --apparent-size * | sort -hr
619M	total
225M	Desktop
221M	Documents
145M	Music
16M	annex.Altis.pbo
12M	Videos
1.7M	youtube-dl
779K	MP_Warlords_01_large.Altis.pbo
4.0K	Templates
4.0K	Steam
4.0K	Public
4.0K	Pictures
4.0K	Downloads
0	nohup.out
Here is another example, this time showing how much data one directory holds, recursively.
jason@jason-Lenovo-H50-55:~/Documents$ du -shc --apparent-size * | sort -hr
221M	total
217M	systemd
2.4M	ballgirl.webm
1.5M	ballreaction.webm
236K	message.txt
4.0K	this_is_a_dir_
275	testing.c
101	nc.nasm
9	NC
This is a very useful way to see how much space your files are taking up.
Yet another option is the ncdu utility, a good tool for viewing the contents of a folder and the disk space they take up.
Type this command to install this utility.
jason@jason-Lenovo-H50-55:~/Documents$ sudo apt install ncdu