Backing up your website is a necessary task for any site owner.
This article describes how to recursively download your website, with all its files, directories, and sub-directories, from an FTP server using the Wget utility.
First of all, create a folder into which you are going to download the site. For example, let’s create the folder backups in your home directory.
# mkdir ~/backups
# cd ~/backups
Download Entire Site from FTP
The following command recursively downloads your site, with all its files and folders, from the FTP server and saves them to the current directory.
# wget -r -l 0 -nH ftp://user:password@ftp.server.com
| ftp.server.com | IP address or domain name of the FTP server |
| -r, --recursive | Turn on recursive retrieving |
| -l, --level | Maximum recursion depth (0 = unlimited) |
| -nH, --no-host-directories | Disable generation of host-prefixed directories |
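Note that the command above embeds the FTP password directly in the URL, where it ends up in your shell history and is visible in the process list. As a minimal alternative sketch (using the same hypothetical user and server), Wget can prompt for the password interactively via the --user and --ask-password options:

# wget -r -l 0 -nH --user=user --ask-password ftp://ftp.server.com

Credentials can also be stored in a ~/.netrc file, which Wget reads automatically.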
Your site has been downloaded:
# ls -l
drwxr-xr-x 4 user group 4096 2013-05-09 18:20 yoursite.com
Back Up the Downloaded Site
Now you can compress the folder with your site as follows:
# tar -czf site-backup-$(date +%Y%m%d-%H%M%S).tar.gz yoursite.com
The previous command creates an archive named something like site-backup-20130509-190638.tar.gz, with the timestamp reflecting the current date and time.
To extract the archive, type:
# tar -zxvf site-backup-20130509-190638.tar.gz
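To verify a backup without unpacking it, you can list the archive’s contents with the -t option of tar:

# tar -tzf site-backup-20130509-190638.tar.gz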
Download a Particular Folder from FTP
Let’s say we have the following structure in the FTP home directory:

yoursite.com/
    www/
        images/
        ...
The following command recursively downloads the ‘images’ folder, with all its content, from the FTP server and saves it to the current directory. The new options are explained in the table below, followed by a short illustration of where the files end up.
# wget -r -l 0 -nH -np --cut-dirs=2 ftp://user:password@ftp.server.com/yoursite.com/www/images
| -np, --no-parent | Don’t ascend to the parent directory |
| --cut-dirs=number | Ignore ‘number’ leading remote directory components |
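To illustrate the effect of these options (using the same hypothetical server and path as above), compare where the downloaded files land:

Without -nH: saved under ftp.server.com/yoursite.com/www/images/
With -nH only: saved under yoursite.com/www/images/
With -nH --cut-dirs=2: saved under images/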
All these tasks should be automated and added to Cron. A HowTo on that is coming …
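In the meantime, here is a minimal sketch of such a Cron job, assuming the backups folder and the hypothetical credentials used above. Note that ‘%’ characters in the date format must be escaped in a crontab. Added via crontab -e, the following entry downloads and archives the site every night at 3:00:

0 3 * * * cd ~/backups && wget -q -r -l 0 -nH ftp://user:password@ftp.server.com && tar -czf site-backup-$(date +\%Y\%m\%d-\%H\%M\%S).tar.gz yoursite.com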