How to copy a website – Hello guys, welcome to another article. In this article I'm going to show you how to download all of a website's files for offline use with Termux or Linux, completely free.
We'll discuss two methods to download a website offline:
- Wget
- Httrack
#1 Wget
Wget is a free software package for retrieving files using HTTP, HTTPS, FTP and FTPS, the most widely used Internet protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X-Windows support, and so on.
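For example, because it is non-interactive, one line is enough to fetch a single file from a script or a cron job. This is just a sketch; the URL below is a placeholder:

# -q silences progress output; -O chooses the local file name
wget -q https://example.com/files/archive.zip -O archive.zip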
Requirements
- Termux or Linux
- Wget package
How to copy a website using Wget
Step 1: Install Wget
Termux
pkg install wget
Linux
sudo apt install wget
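Either way, you can check that the install worked before moving on (the version string you see will differ):

wget --version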
Step 2: Download the website
wget -mkEpnp (your website URL)
Once the download finishes, you have all of the website's data offline.
Detailed way
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent (your website URL)
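Here is what each of those options does, with https://example.com standing in for your site:

# --mirror            recurse through the site with timestamping (shorthand for -r -N -l inf --no-remove-listing)
# --convert-links     rewrite links in the saved pages so they work offline
# --adjust-extension  add .html to saved pages where the server didn't
# --page-requisites   also fetch the CSS, images, and scripts each page needs
# --no-parent         never ascend above the starting URL's directory
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/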
#2 Httrack
HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.
It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site’s relative link-structure. Simply open a page of the “mirrored” website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
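As a rough sketch of those last two features, assuming you run these from inside an existing mirror's project folder (created as in Step 2 below):

# refresh an existing mirror with any pages that changed
httrack --update
# resume a mirror that was interrupted part-way
httrack --continue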
Requirements
- Termux, Linux, or Windows
- Httrack package or installer
How to copy a website with Httrack
Step 1: Install Httrack
Termux
pkg install httrack
Linux
sudo apt install httrack
Windows
Download the installer from the official site, https://www.httrack.com, and run it.
Step 2: Download the website
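A typical mirror command looks like this; https://example.com is a placeholder for your own site, and -O sets the folder the copy is saved into:

httrack "https://example.com/" -O "./example-site"

When it finishes, open index.html in the output folder with any browser and browse the copy offline, link to link, just as described above.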