Backup Site Recursively from FTP with Wget
Backing up your website is a necessary step for all users.
This article describes how to recursively download your website, with all its files, directories and sub-directories, from an FTP server using the Wget utility.
First of all, create a folder into which you are going to download the site, and change into it. For example:
# mkdir ~/backups
# cd ~/backups
Download Entire Site from FTP
The following command recursively downloads your site, with all its files and folders, from the FTP server and saves them to the current directory.
| Option | Description |
|----------------------------|-------------------------------------------------|
| ftp.server.com | IP address or domain name of the FTP server |
| -r, --recursive | Recursive retrieving |
| -l, --level | Maximum recursion depth (0 = unlimited) |
| -nH, --no-host-directories | Disable generation of host-prefixed directories |
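Putting these options together, the command might look like the following sketch (user, password and ftp.server.com are placeholders for your own FTP credentials and server):

```shell
# Recursively download the entire site from the FTP server into the
# current directory: -l 0 removes the recursion depth limit, and -nH
# prevents wget from creating an ftp.server.com/ wrapper directory.
# user, password and ftp.server.com are placeholders.
wget -r -l 0 -nH ftp://user:password@ftp.server.com/
```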
Your site has been downloaded:
drwxr-xr-x 4 user group 4096 2013-05-09 18:20 yoursite.com
Backup Downloaded Site
Now you can compress the folder with your site as follows :
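A typical way to do this is with tar; the folder name yoursite.com below is a placeholder for whatever folder wget created, so adjust it to match your download:

```shell
# Create a gzip-compressed archive of the downloaded site folder.
# "yoursite.com" is a placeholder for the folder wget created.
tar -czf yoursite.com.tar.gz yoursite.com
```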
The previous command creates a gzip-compressed archive named after the site folder.
To extract the archive, type :
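Assuming the archive is named yoursite.com.tar.gz (a placeholder; substitute your actual archive name):

```shell
# Unpack the archive into the current directory.
tar -xzf yoursite.com.tar.gz
```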
Download a Particular Folder from FTP
Let's say we have the following structure in the FTP home directory:
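For illustration, assume a layout like this (only the images folder comes from the original example; the other entries are hypothetical):

```
css/
images/
js/
index.html
```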
The following command recursively downloads the 'images' folder, with all its contents, from the FTP server and saves it to the current directory.
| Option | Description |
|------------------|-----------------------------------------------|
| -np, --no-parent | Don't ascend to the parent directory |
| --cut-dirs | Ignore the given number of parent directories |
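A sketch of such a command is shown below (again, user, password and ftp.server.com are placeholders; whether you want --cut-dirs, and with what number, depends on how deep the folder sits on the server):

```shell
# Recursively fetch only the images/ folder from the FTP server.
# -np keeps wget from ascending above images/, and --cut-dirs=1
# drops the leading "images/" path component so the files land
# directly in the current directory.
# user, password and ftp.server.com are placeholders.
wget -r -l 0 -nH -np --cut-dirs=1 ftp://user:password@ftp.server.com/images/
```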
All these tasks can be automated by adding them to Cron. A HowTo on that is coming soon.