How to Download Large Files in Chunks Using Wget on Ubuntu
This article outlines the process of downloading large files on Ubuntu using the wget command-line tool. It details how to configure wget to resume interrupted transfers and limit bandwidth usage for better stability. By following these instructions, you can manage large downloads efficiently without third-party tools.
Install Wget
Wget is typically pre-installed on Ubuntu. To verify installation, open your terminal and run the following command:
wget --version
If it is not installed, update your package list and install it using:
sudo apt update
sudo apt install wget
Enable Resume Functionality
The primary method for handling large files in wget is the --continue flag. It lets you resume a download if your connection drops, effectively allowing you to download the file in separate sessions. Use the -c or --continue option followed by the file URL:
wget -c https://example.com/largefile.zip
If the download is interrupted, running the same command again will pick up where it left off instead of restarting from zero.
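On an unreliable connection, you can also let wget retry on its own instead of rerunning the command by hand. The following is a minimal sketch combining resume with wget's built-in retry options, using the same example URL; tune the values to your connection:
# Resume (-c), retry indefinitely (--tries=0), treat "connection
# refused" as a transient error, and back off up to 5 seconds
# between attempts (--waitretry).
wget -c --tries=0 --retry-connrefused --waitretry=5 https://example.com/largefile.zip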
Limit Download Speed
To prevent the download from consuming all available bandwidth, you
can limit the download rate. This ensures your system remains responsive
during the transfer. Use the --limit-rate flag with a value suffixed by k (kilobytes per second) or m (megabytes per second):
wget -c --limit-rate=500k https://example.com/largefile.zip
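A rate-limited transfer of a large file can take hours, so it often helps to run it in the background and log progress to a file. This is a minimal sketch using wget's standard background and logging options with the same example URL:
# Run in the background (-b) and write progress to download.log (-o)
# instead of the terminal; -c still allows resuming.
wget -c -b --limit-rate=500k -o download.log https://example.com/largefile.zip
# Follow the progress at any time with:
tail -f download.log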
Understanding Chunking Limitations
It is important to note that wget downloads files sequentially, not in parallel chunks. While the -c flag allows you to break the download into time-based segments by resuming, it does not split the file into multiple concurrent streams. For true parallel chunk downloading, the aria2c utility is recommended, but wget remains the standard tool for reliable sequential downloads with resume capabilities on Ubuntu.
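If you do need parallel chunking, the following is a minimal sketch using aria2c against the same example URL. It assumes the aria2 package from the Ubuntu repositories, and the connection counts are illustrative; many servers cap how many simultaneous connections they accept:
sudo apt install aria2
# Split the file into 8 segments (-s) fetched over up to
# 8 connections to the server (-x); -c resumes partial downloads.
aria2c -c -x 8 -s 8 https://example.com/largefile.zip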