Using curl and wget
Introduction
In the realm of command line tools, curl and wget are two of the most commonly used utilities for downloading content from the web. They are powerful, flexible, and support a wide range of protocols. This tutorial will guide you through the basic and advanced usage of these tools with practical examples.
Getting Started with curl
curl is a command line tool for transferring data with URLs. It supports numerous protocols, including HTTP, HTTPS, FTP, and more. Let's start with a simple example of how to download a web page using curl.
curl http://example.com
This command fetches the content from the provided URL and prints it to standard output (your terminal). To save the output to a file, use the -o option:
curl -o example.html http://example.com
This command saves the content to a file named example.html.
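Since curl also speaks the file:// protocol, you can exercise the -o behavior entirely offline. A minimal sketch (the paths here are illustrative assumptions):

```shell
# Create a throwaway local file to stand in for a remote resource.
printf 'hello from curl\n' > /tmp/curl-demo-source.txt

# -s silences the progress meter; -o names the output file.
# file:// makes curl copy the local file, mimicking a download.
curl -s -o /tmp/curl-demo-copy.txt file:///tmp/curl-demo-source.txt

cat /tmp/curl-demo-copy.txt
```

The same -s and -o flags behave identically against http:// URLs.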
Advanced curl Usage
curl offers a plethora of options for more complex tasks. Here are a few advanced examples:
1. Downloading Multiple Files
To download multiple files, you can use multiple -O options; each file is saved in the current directory under its remote name:
curl -O http://example.com/file1 -O http://example.com/file2
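The "saved under its remote name" behavior can be checked offline with file:// URLs (directory and file names below are illustrative assumptions):

```shell
# Two throwaway source files standing in for remote URLs.
printf 'one\n' > /tmp/demo-file1.txt
printf 'two\n' > /tmp/demo-file2.txt

# Work in a fresh directory so the downloaded names are easy to see.
mkdir -p /tmp/curl-multi && cd /tmp/curl-multi

# Each -O saves its URL under the remote filename in the current directory.
curl -s -O file:///tmp/demo-file1.txt -O file:///tmp/demo-file2.txt

ls demo-file1.txt demo-file2.txt
```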
2. Resuming Downloads
If a download is interrupted, you can resume it using the -C - option:
curl -C - -O http://example.com/largefile
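With -C -, curl looks at the size of the existing output file and fetches only the remaining bytes. One way to sketch this offline (file names are illustrative assumptions; over HTTP the same mechanism uses a Range request):

```shell
# The full "remote" file.
printf 'ABCDEFGHIJ' > /tmp/curl-resume-src.txt

# Simulate an interrupted download: only the first 5 bytes arrived.
printf 'ABCDE' > /tmp/curl-resume-out.txt

# -C - checks the output file's size and appends only the missing bytes.
curl -s -C - -o /tmp/curl-resume-out.txt file:///tmp/curl-resume-src.txt

cat /tmp/curl-resume-out.txt
```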
3. Sending POST Requests
To send a POST request with data, use the -d option:
curl -d "param1=value1&param2=value2" -X POST http://example.com/form
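To see exactly what body -d sends, you can point curl at a throwaway local echo server instead of a real site. This sketch assumes python3 is available and that port 18089 is free; both are assumptions, not part of curl itself:

```shell
# Minimal local server (an assumption for this demo) that echoes the POST body back.
python3 - <<'EOF' &
from http.server import BaseHTTPRequestHandler, HTTPServer

class Echo(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        body = self.rfile.read(length)   # the -d payload arrives here
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)           # echo it back to curl
    def log_message(self, *args):        # keep the demo quiet
        pass

HTTPServer(('127.0.0.1', 18089), Echo).serve_forever()
EOF
SERVER_PID=$!
sleep 1

# Send the form data and save the echoed body; curl uses POST when -d is given.
curl -s -d "param1=value1&param2=value2" \
     -o /tmp/curl-post-response.txt http://127.0.0.1:18089/form

kill "$SERVER_PID"
cat /tmp/curl-post-response.txt
```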
Getting Started with wget
wget is another command line utility for downloading files from the web. It is particularly useful for downloading large files or recursively downloading entire directories.
wget http://example.com/file
This command downloads the specified file to the current directory.
Advanced wget Usage
wget also offers many advanced options for more complex scenarios:
1. Downloading Files to a Specific Directory
Use the -P option to specify the directory in which to save the downloaded file:
wget -P /path/to/directory http://example.com/file
2. Recursive Downloads
To download an entire website recursively, use the -r option (by default, wget follows links up to five levels deep):
wget -r http://example.com
3. Limiting Download Speed
To limit the download speed, use the --limit-rate option:
wget --limit-rate=200k http://example.com/largefile
Conclusion
Both curl and wget are powerful tools for downloading content from the web. While curl is more versatile, with its support for a wide range of protocols and options, wget is excellent for downloading large files and entire directories. By mastering these tools, you can significantly enhance your command line capabilities for networking tasks.