How do I use Curl to send multiple requests?

Curl is a powerful tool used to transfer data from or to a network server, using one of the supported protocols (HTTP, HTTPS, FTP, and more). It is often used for testing RESTful APIs or for creating scripts to automate HTTP requests.

To send multiple requests using curl, you can simply repeat the curl command with the desired URL multiple times in your command line or bash script.

Shell script:

#!/bin/bash

# URL to send requests to
url="http://my.website.com/api"

# Send 5 requests to the URL
for i in {1..5}
do
   curl "$url"
done

This shell script will send five GET requests to the specified URL.
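The same loop pattern works when each request needs its own data. As a minimal sketch, here is a variant that sends a POST request with a JSON body on each iteration (the URL and payload are hypothetical placeholders):

#!/bin/bash

# Hypothetical endpoint; replace with your own
url="http://my.website.com/api"

# Send 5 POST requests, each carrying the loop counter in a JSON body
for i in {1..5}
do
   curl -X POST "$url" \
        -H "Content-Type: application/json" \
        -d "{\"request\": $i}"
done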

However, if you want to send multiple requests concurrently, you can use curl's -Z, --parallel option. When you pass several URLs to a single curl invocation along with -Z, curl performs the transfers in parallel instead of one after another.

Here is an example of how you can use it:

#!/bin/bash

# URLs to send requests to
url1="http://my.website.com/api1"
url2="http://my.website.com/api2"
url3="http://my.website.com/api3"

# Send 3 requests concurrently in a single curl invocation
curl -Z "$url1" -o output1.txt \
        "$url2" -o output2.txt \
        "$url3" -o output3.txt

This script sends 3 GET requests concurrently to the specified URLs (GET is curl's default method). The -o option is given once per URL, so the output of each request is saved to a separate file.

Please note that the -Z, --parallel option was added in curl 7.66.0 and is not available in older versions. You can check your version with curl --version.
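If your curl does not support --parallel, a common workaround is to parallelize at the shell level instead, for example with xargs -P. Here is a minimal sketch under that assumption (the URLs are placeholders; -O names each output file after the last part of its URL):

#!/bin/bash

# Run up to 3 curl processes at once, one URL each
printf '%s\n' \
    "http://my.website.com/api1" \
    "http://my.website.com/api2" \
    "http://my.website.com/api3" \
| xargs -n 1 -P 3 curl -s -O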

Remember to replace the http://my.website.com/... URLs with the URLs you wish to send requests to.

As always, be careful not to overload a server with too many concurrent requests. This could cause the server to become unresponsive or even crash. Always ensure that you respect the rate limits and terms of use of the server you are sending requests to.
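To help with this, curl provides a --parallel-max option that caps how many transfers run at once (the default is 50). A short sketch, again using the placeholder URLs from above:

# Allow at most 2 of the 3 transfers to run at the same time
curl -Z --parallel-max 2 \
    "http://my.website.com/api1" -o output1.txt \
    "http://my.website.com/api2" -o output2.txt \
    "http://my.website.com/api3" -o output3.txt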
