
How to Enable wget Concurrent Connections on Ubuntu

This guide explains how to manage concurrent connections when downloading files on Ubuntu using wget. While standard GNU Wget does not support multi-threaded downloads of a single file, newer alternatives such as wget2 and aria2c provide this functionality. You will learn the specific commands to enable parallel downloads and how to work around the limitations of standard wget with shell scripting.

Standard Wget Limitations

The default version of wget installed on most Ubuntu systems (the GNU Wget 1.x series) cannot split a single file across multiple concurrent connections; it establishes one connection per file. Attempting to use a flag like --connections results in an error because no such option exists in the standard package. This single-connection behavior keeps the client simple and avoids overloading servers, but it can slow down large downloads on high-latency networks.
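
For example, passing a nonexistent parallelism flag such as --connections produces an error along these lines (the exact wording varies between versions):

    wget: unrecognized option '--connections'
    Try 'wget --help' for more options.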

Using Wget2 for Parallel Downloads

Wget2 is the successor to GNU Wget and supports HTTP/2 and multi-threaded downloads. It is available in the standard repositories on recent Ubuntu releases; install it first if it is not already present on your system.

  1. Install wget2:

    sudo apt update
    sudo apt install wget2
  2. Download a single file using 16 concurrent threads:

    wget2 --max-threads=16 --chunk-size=10M https://example.com/file.zip

The --max-threads option sets the number of parallel download threads, and --chunk-size splits a single file into pieces of the given size so those threads can fetch it concurrently; without a chunk size, the threads only parallelize across separate files. Adjust both values based on your bandwidth and the server's tolerance.
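
Because wget2 parallelizes across multiple URLs by default, the same flag also speeds up batch downloads. The file names below are placeholders:

    wget2 --max-threads=8 https://example.com/a.iso https://example.com/b.iso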

Alternative Using Aria2

If wget2 is unavailable, aria2 is a lightweight alternative available in the Ubuntu repositories that specializes in multi-connection downloads.

  1. Install aria2:

    sudo apt install aria2
  2. Download a file with 16 concurrent connections:

    aria2c -x 16 -s 16 https://example.com/file.zip

The -x flag (--max-connection-per-server) sets the maximum number of connections to a single server, which aria2 caps at 16, and -s (--split) sets how many pieces the file is divided into for parallel download.
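
By default aria2 only opens an extra connection for pieces larger than its minimum split size of 20 MiB, so mid-sized files may not use all 16 connections. Lowering the threshold with -k forces more splits; the output name given with -o below is a placeholder:

    aria2c -x 16 -s 16 -k 1M -o file.zip https://example.com/file.zip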

Parallelizing Multiple Files with Standard Wget

If you must use standard wget to download multiple different files concurrently, you can run multiple instances in the background using a bash loop.

  1. Create a file list named urls.txt, with one URL per line.

  2. Run the following command to process 5 URLs at a time:

    cat urls.txt | xargs -P 5 -n 1 wget

The -P 5 argument tells xargs to run up to 5 wget processes simultaneously, and -n 1 passes one URL to each. This does not speed up a single file download, but it increases throughput when fetching many separate files. Adding -q or -nv to the wget command keeps the concurrent progress output from interleaving in the terminal.
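
If you prefer to avoid xargs, a plain bash loop achieves the same effect. This is a minimal sketch assuming bash 4.3 or newer (for wait -n) and a urls.txt with one URL per line:

    #!/usr/bin/env bash
    # Download every URL in urls.txt with at most 5 wget processes at a time.
    max_jobs=5
    while IFS= read -r url; do
        [ -z "$url" ] && continue    # skip blank lines
        wget -q "$url" &
        # Block whenever the job pool is full; wait -n returns as soon as
        # any one background job finishes.
        while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]; do
            wait -n
        done
    done < urls.txt
    wait    # let the remaining downloads finish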