
wget -i: Download URLs from a File

`wget -i` reads a list of URLs from a text file and downloads them sequentially. It is useful for batch-downloading large numbers of files or for processing URL lists generated dynamically by scripts. Each URL must be on its own line in the file.

Overview

`wget -i` removes the hassle of typing each URL by hand when you need to download multiple files at once. It is particularly handy for downloading web-crawl results or files that match a specific pattern. The command processes the URLs in the specified file sequentially, and any other `wget` options you pass apply to every download in the list.
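For dynamically generated lists, the URL file does not have to be written by hand. A minimal sketch (the `example.com` paths are placeholders) that builds a numbered list and hands it to `wget`:

```shell
# Build a list of sequentially numbered URLs (placeholder paths),
# one per line, into urls.txt.
for n in $(seq 1 3); do
  printf 'http://example.com/archive/part%02d.zip\n' "$n"
done > urls.txt

# wget -i urls.txt      # download the whole list
# wget -i - < urls.txt  # or read the URLs from standard input
```

With `-i -`, `wget` reads URLs from standard input, so a generator script can pipe directly into it without a temporary file.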

Key Features

  • Batch processing of URL lists
  • Easy integration with scripts
  • Supports download resuming
  • Can be combined with various `wget` options

Common Options

Options frequently used with `wget -i`.

  • `-i <file>`: Read URLs from `<file>`; `-i -` reads them from standard input
  • `-P <dir>`: Save downloaded files under `<dir>` (`--directory-prefix`)
  • `-c`: Resume partially downloaded files (`--continue`)
  • `-o <file>`: Write log messages to `<file>` instead of standard error (`-a` appends)
  • `-nc`: Skip files that already exist locally (`--no-clobber`)
  • `--limit-rate=<rate>`: Cap the download speed, e.g. `--limit-rate=500k`

Usage Examples

Various scenarios using the `wget -i` command.

Example of creating a URL file

printf '%s\n' "http://example.com/file1.zip" "http://example.com/image.jpg" "https://www.gnu.org/software/wget/manual/wget.pdf" > urls.txt

Creates a `urls.txt` file containing a list of URLs to download.
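A here-document is an equally workable way to create the same file without worrying about escape sequences; each line of the body becomes one URL:

```shell
# Each line between the EOF markers is written to urls.txt verbatim.
cat > urls.txt <<'EOF'
http://example.com/file1.zip
http://example.com/image.jpg
https://www.gnu.org/software/wget/manual/wget.pdf
EOF
```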

Basic usage

wget -i urls.txt

Downloads all URLs listed in the `urls.txt` file to the current directory.

Download to a specific directory

wget -i urls.txt -P /data/downloads

Saves downloaded files to the `/data/downloads` directory.

Resume downloads and log output

wget -i urls.txt -c -o wget_log.txt

Resumes interrupted downloads and logs all progress and errors to `wget_log.txt`. Note that `-o` overwrites the log on each run; use `-a` to append to an existing log instead.

Limit download speed and avoid overwriting files

wget -i urls.txt --limit-rate=500k -nc

Limits the download speed to 500KB/s and skips existing files instead of overwriting them.

Tips & Precautions

Tips for increasing efficiency and preventing potential issues when using `wget -i`.

Useful Tips

  • **URL File Format**: Each line must contain exactly one URL. Blank lines are skipped, but `wget` does not document a comment syntax, so a line starting with `#` may be reported as an invalid URL; it is safest to keep the file to URLs only.
  • **Resuming Downloads**: The `-c` option allows you to resume interrupted downloads, which is very useful for large files or in unstable network environments.
  • **Checking Logs**: Using the `-o` option to create a log file helps track download progress, errors, etc., which is very helpful for troubleshooting.
  • **Parallel Downloads**: `wget -i` itself does not support parallel downloads. To download multiple files simultaneously, consider combining it with other tools like `xargs -P` or using a parallel download manager like `aria2c`.
  • **Preventing File Overwrites**: The `-nc` (no-clobber) option prevents overwriting existing files, thus avoiding accidental corruption of important data.
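The `xargs -P` combination mentioned above can be sketched as follows. The `echo` makes it a dry run that only prints each command; remove it to actually download (the URLs are placeholders):

```shell
# Recreate a small URL list for the demonstration.
printf '%s\n' \
  'http://example.com/file1.zip' \
  'http://example.com/image.jpg' > urls.txt

# -n 1 passes one URL per wget invocation; -P 4 keeps up to four
# invocations running at once. "echo" turns this into a dry run.
xargs -P 4 -n 1 echo wget -nc < urls.txt
```

`aria2c` accepts the same kind of list directly (`aria2c -i urls.txt`) and parallelizes downloads without the extra tooling.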
