Download All Images from a Webpage Using Wget on Ubuntu
This guide explains how to use the wget command on Ubuntu to batch-download images from a website. You will learn the exact syntax needed to filter for image files and retrieve them without saving each one manually.
Prerequisites
Ensure wget is installed on your Ubuntu system. Open your terminal and run the following command to install it if necessary:
sudo apt install wget
The Command
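You can confirm the installation succeeded by asking wget for its version string:

```shell
# Print the first line of wget's version banner, e.g. "GNU Wget 1.21.2 ..."
wget --version | head -n 1
```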
To download all images from a specific webpage, use the following command:
wget -r -A .jpg,.jpeg,.png,.gif -e robots=off [URL]
Replace [URL] with the actual address of the webpage you wish to scrape.
Explanation of Flags
- -r: Enables recursive downloading to follow links on the page.
- -A: Specifies the accept list to filter files ending in .jpg, .jpeg, .png, or .gif.
- -e robots=off: Ignores the site's robots.txt file, which might otherwise prevent the download. Use this option responsibly and respect the site's terms of use.
- [URL]: The target website address.
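A few other wget flags are often useful alongside the ones above. The variant below is a sketch; the depth limit, delay, and the images/ folder name are assumptions you can adjust:

```shell
# Optional extras (folder name "images" is an assumption for illustration):
#   -l 2       limit recursion to two levels of links
#   -nd        don't recreate the site's directory tree locally
#   -P images  save all files into the images/ folder
#   --wait=1   pause one second between requests to be polite to the server
wget -r -l 2 -nd -P images -A .jpg,.jpeg,.png,.gif -e robots=off --wait=1 [URL]
```

As before, replace [URL] with the page you want to download from.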
Example
To download images from example.com, run:
wget -r -A .jpg,.jpeg,.png,.gif -e robots=off https://www.example.com
The images will be saved in a directory structure mirroring the website within your current folder.
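After the download finishes, you can check how many images were retrieved. The sketch below searches the current folder for the same extensions the -A flag accepted; run it from the directory where you invoked wget:

```shell
# Count image files (matching the extensions passed to -A) under the
# current directory, including the mirrored site folder wget created.
find . -type f \( -iname '*.jpg' -o -iname '*.jpeg' \
     -o -iname '*.png' -o -iname '*.gif' \) | wc -l
```

The command prints a single number; a result of 0 usually means the page's images live on a different host, in which case wget's -H and -D flags (span hosts, restricted to listed domains) may help.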