What is the purpose of the --compressed option in cURL?

The --compressed option in cURL enables automatic compression support for HTTP requests and responses. This feature can significantly reduce bandwidth usage and improve transfer speeds, especially for large text-based content like HTML, JSON, or XML.

How It Works

When you use --compressed, cURL automatically:

  1. Sends Accept-Encoding header: Tells the server which compression algorithms it supports (gzip, deflate, brotli)
  2. Receives compressed response: If the server supports one of those algorithms, it sends the response body compressed
  3. Automatically decompresses: cURL transparently decompresses the data for you
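
You can watch this negotiation with cURL's verbose output. A minimal check, assuming a Unix-like shell (the exact Accept-Encoding value depends on how your cURL was built):

# -v prints request headers (lines starting with >) and response headers (lines starting with <);
# -s hides the progress meter and -o /dev/null discards the body
curl -sv --compressed -o /dev/null https://httpbin.org/gzip 2>&1 | grep -i encoding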

Basic Usage

# Enable compression for a simple GET request
curl --compressed https://api.example.com/data

# Save the response to a file (cURL decompresses it before writing)
curl --compressed -o output.html https://www.example.com

# Combine with other options
curl --compressed -H "User-Agent: MyApp" https://api.github.com/users/octocat

Bandwidth Savings Example

Let's see the difference compression makes:

# Without compression - check response size
curl -s -o /dev/null -w "Size: %{size_download} bytes\n" https://httpbin.org/html

# With compression - typically 60-80% smaller for text content
curl -s -o /dev/null --compressed -w "Size: %{size_download} bytes\n" https://httpbin.org/html
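
For a comparison that does not depend on how %{size_download} is counted in your cURL build, save both payloads to files (named here for illustration) and measure them. Setting Accept-Encoding by hand, without --compressed, makes cURL pass the compressed bytes through untouched:

# Raw compressed payload: header set manually, so cURL does not decompress
curl -s -H "Accept-Encoding: gzip" -o body.gz https://httpbin.org/html

# Decompressed payload via --compressed
curl -s --compressed -o body.html https://httpbin.org/html

# Compare the two file sizes
wc -c body.gz body.html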

Viewing Compression Headers

To see compression in action, include headers in the output:

curl --compressed -i https://httpbin.org/gzip

Look for these headers:

  • Content-Encoding: gzip - the server compressed the response body with gzip
  • Accept-Encoding: gzip, deflate - the algorithms cURL advertised; this is a request header, so use -v (or the request headers echoed in httpbin's JSON body) to see it
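
If you only want the response headers without the body, dump them to stdout and discard the body instead of using -i:

# -D - writes the received headers to stdout; the body goes to /dev/null
curl --compressed -s -D - -o /dev/null https://httpbin.org/gzip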

API Usage

Particularly useful for API calls that return large JSON responses:

# Fetch large JSON dataset with compression
curl --compressed \
  -H "Accept: application/json" \
  https://api.example.com/large-dataset

# POST with compressed response
curl --compressed \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{"query": "large dataset"}' \
  https://api.example.com/search
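
Because cURL hands you the decompressed body, compressed API responses pipe into other tools exactly like uncompressed ones. A quick sketch, assuming jq is installed and that the (hypothetical) endpoint above returns JSON:

# Downstream tools see plain JSON; decompression happens inside cURL
curl --compressed -s \
  -H "Accept: application/json" \
  https://api.example.com/large-dataset | jq '.'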

Requirements

  • cURL must be built with zlib for gzip/deflate support (standard in most distributions); brotli requires an additional library
  • Server must support compression (most modern web servers do)
  • Works with HTTP/HTTPS requests
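
To confirm the first requirement, check the Features line of curl --version: an entry such as libz means gzip/deflate decompression is available, and brotli appears when that library was linked in (exact feature names vary by version and platform):

# Print build information; the Features line lists available decompression support
curl --version

# Or filter for the relevant features directly
curl --version | grep -iE 'libz|brotli'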

Alternatives in Other Languages

Python with requests

import requests

# requests handles compression automatically by default
response = requests.get('https://api.example.com/data')
print(f"Content-Encoding: {response.headers.get('Content-Encoding', 'none')}")

Node.js

const axios = require('axios');

// axios decompresses gzip/deflate/br responses by default
(async () => {
  const response = await axios.get('https://api.example.com/data');
  console.log('Content-Encoding:', response.headers['content-encoding']);
})();

When to Use

The --compressed option is most beneficial for:

  • Large text-based responses (HTML, JSON, XML, CSS, JavaScript)
  • Slow network connections
  • APIs with verbose responses
  • Batch operations where bandwidth matters

Note: Formats that are already compressed (JPEG/PNG images, video, ZIP archives) won't benefit significantly from HTTP compression.

Try WebScraping.AI for Your Web Scraping Needs

Looking for a powerful web scraping solution? WebScraping.AI provides an LLM-powered API that combines Chromium JavaScript rendering with rotating proxies for reliable data extraction.

Key Features:

  • AI-powered extraction: Ask questions about web pages or extract structured data fields
  • JavaScript rendering: Full Chromium browser support for dynamic content
  • Rotating proxies: Datacenter and residential proxies from multiple countries
  • Easy integration: Simple REST API with SDKs for Python, Ruby, PHP, and more
  • Reliable & scalable: Built for developers who need consistent results

Getting Started:

Get page content with AI analysis:

curl "https://api.webscraping.ai/ai/question?url=https://example.com&question=What is the main topic?&api_key=YOUR_API_KEY"

Extract structured data:

curl "https://api.webscraping.ai/ai/fields?url=https://example.com&fields[title]=Page title&fields[price]=Product price&api_key=YOUR_API_KEY"
