How do I use Curl to handle JSON responses?
JSON (JavaScript Object Notation) is the most common data format for modern web APIs. When working with REST APIs or web services, you'll frequently need to handle JSON responses using curl. This guide covers everything you need to know about receiving, parsing, and processing JSON data with curl commands.
Basic JSON Response Handling
The simplest way to receive a JSON response is to make a GET request to an API endpoint (GET is curl's default method, so the explicit -X GET is optional):
curl -X GET https://api.example.com/users
This will output the raw JSON response to your terminal. However, for better readability and processing, you'll want to use additional options and tools.
Pretty-Printing JSON Responses
Raw JSON responses are often minified and difficult to read. You can pipe curl output to jq for readable formatting:
curl -s https://api.example.com/users | jq '.'
The -s (silent) flag suppresses progress information, giving you clean JSON output. If jq isn't available, you can use Python's built-in JSON formatter:
curl -s https://api.example.com/users | python -m json.tool
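Neither formatter needs a live API; you can try both on a sample string (the JSON below is invented for illustration):

```shell
# A minified sample standing in for an API response
sample='{"id":7,"name":"Ada Lovelace","active":true}'

# jq pretty-prints with two-space indentation
echo "$sample" | jq '.'

# Python's json.tool is a portable fallback (four-space indent by default)
echo "$sample" | python3 -m json.tool
```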
Setting Proper Headers for JSON
When requesting JSON data, it's good practice to set the Accept header to specify that you expect JSON:
curl -H "Accept: application/json" https://api.example.com/users
For POST requests with JSON data, you'll need to set the Content-Type header:
curl -X POST \
-H "Content-Type: application/json" \
-H "Accept: application/json" \
-d '{"name": "John Doe", "email": "john@example.com"}' \
https://api.example.com/users
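If your curl is 7.82.0 or newer, the --json shorthand sets both headers for you and implies a POST; this is a convenience rather than a requirement, so check your installed version first:

```shell
# On curl 7.82.0+,
#   curl --json '{"name": "John Doe"}' https://api.example.com/users
# is equivalent to
#   -H "Content-Type: application/json" -H "Accept: application/json" \
#   -d '{"name": "John Doe"}'
# Print the installed version to see whether the shorthand applies:
curl --version | head -n 1
```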
Sending JSON Data in Requests
Using Inline JSON
You can include JSON data directly in your curl command:
curl -X POST \
-H "Content-Type: application/json" \
-d '{"username": "testuser", "password": "secret123"}' \
https://api.example.com/login
Using JSON from File
For larger JSON payloads, store the data in a file and reference it:
# Create user.json file
echo '{"name": "Jane Smith", "email": "jane@example.com", "role": "admin"}' > user.json
# Send the file contents
curl -X POST \
-H "Content-Type: application/json" \
-d @user.json \
https://api.example.com/users
Extracting Specific Values from JSON
Using jq, you can extract specific fields from JSON responses:
# Get just the user names
curl -s https://api.example.com/users | jq '.[].name'
# Get the first user's email
curl -s https://api.example.com/users | jq '.[0].email'
# Get users with specific criteria
curl -s https://api.example.com/users | jq '.[] | select(.role == "admin")'
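These filters work on any JSON, so you can experiment on a local sample before pointing them at a real API (the records below are invented):

```shell
users='[{"name":"Ada","email":"ada@example.com","role":"admin"},
        {"name":"Bob","email":"bob@example.com","role":"user"}]'

echo "$users" | jq '.[].name'       # every name, quoted
echo "$users" | jq -r '.[0].email'  # -r drops the quotes for shell use
echo "$users" | jq '.[] | select(.role == "admin") | .name'
```

The -r (raw output) flag is worth remembering whenever you feed a jq result into another shell command.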
Handling Authentication with JSON APIs
Bearer Token Authentication
Many APIs use Bearer tokens for authentication:
curl -H "Authorization: Bearer YOUR_TOKEN_HERE" \
-H "Accept: application/json" \
https://api.example.com/protected-resource
Basic Authentication
For APIs requiring basic authentication:
curl -u username:password \
-H "Accept: application/json" \
https://api.example.com/users
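Passing -u username:password puts credentials into your shell history and process list. A common alternative is to store them in ~/.netrc (with file mode 600) and have curl read them via -n/--netrc. A minimal entry looks like this (hostname and credentials are placeholders):

```
machine api.example.com
login username
password secret123
```

With that file in place, curl -n -H "Accept: application/json" https://api.example.com/users authenticates without credentials appearing on the command line.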
Error Handling and Status Codes
When working with JSON APIs, it's crucial to handle HTTP status codes properly:
# Check status code and show response
curl -w "\nHTTP Status: %{http_code}\n" \
-H "Accept: application/json" \
https://api.example.com/users
# Only show output for successful requests
curl -f -s https://api.example.com/users | jq '.'
The -f flag makes curl fail silently on HTTP errors (4xx and 5xx responses): the response body is suppressed and curl exits with code 22, which your script can check.
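A common pattern builds on -w: append the status code to the body with -w '\n%{http_code}' and split it off in the shell. The splitting step can be sketched offline with a canned response (the 404 below is simulated, not fetched):

```shell
# What curl -s -w '\n%{http_code}' would hand back: body, newline, status
response=$'{"error": "not found"}\n404'

status=${response##*$'\n'}   # text after the last newline: the status code
body=${response%$'\n'*}      # text before it: the JSON body

if [ "$status" -ge 400 ]; then
  echo "Request failed with HTTP $status" >&2
else
  echo "$body"
fi
```

In a real script you would set response=$(curl -s -w '\n%{http_code}' "$url") and the splitting works the same way.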
Saving JSON Responses
Save to File
curl -s https://api.example.com/users -o users.json
Save with Response Headers
curl -i https://api.example.com/users -o response_with_headers.txt
Advanced JSON Processing Examples
Pagination Handling
Many APIs use pagination. Here's how to handle paginated JSON responses:
# Get first page
curl -s "https://api.example.com/users?page=1&limit=10" | jq '.'
# Extract pagination info
curl -s "https://api.example.com/users?page=1&limit=10" | jq '.pagination'
# Loop through pages (bash script)
#!/bin/bash
page=1
while true; do
  response=$(curl -s "https://api.example.com/users?page=$page&limit=10")
  users=$(echo "$response" | jq '.data[]')
  if [ "$users" = "null" ] || [ -z "$users" ]; then
    break
  fi
  echo "$users"
  ((page++))
done
Filtering and Transforming Data
# Transform JSON structure
curl -s https://api.example.com/users | \
jq '.[] | {fullName: .name, contact: .email, isActive: (.status == "active")}'
# Count specific items
curl -s https://api.example.com/users | \
jq '[.[] | select(.role == "admin")] | length'
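As with extraction, you can rehearse these transforms on inline data before involving an API (fields invented for the example; note that comparison expressions used as object values need parentheses in jq):

```shell
users='[{"name":"Ada","email":"a@x.io","status":"active","role":"admin"},
        {"name":"Bob","email":"b@x.io","status":"inactive","role":"user"}]'

# Reshape each record into a new object
echo "$users" | jq '.[] | {fullName: .name, isActive: (.status == "active")}'

# Count the admins
echo "$users" | jq '[.[] | select(.role == "admin")] | length'
```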
Integration with Other Tools
Using with grep for Simple Filtering
# Find users with specific domain
curl -s https://api.example.com/users | grep -o '"email":"[^"]*@company\.com"'
Combining with sed for Quick Replacements
# Replace API domain in URLs within JSON
curl -s https://api.example.com/users | \
sed 's/api\.example\.com/api\.newdomain\.com/g'
Real-World Example: Weather API
Here's a practical example using a weather API:
# Get weather data
weather_data=$(curl -s "https://api.openweathermap.org/data/2.5/weather?q=London&appid=YOUR_API_KEY")
# Extract specific information
temperature=$(echo "$weather_data" | jq '.main.temp')
description=$(echo "$weather_data" | jq -r '.weather[0].description')
humidity=$(echo "$weather_data" | jq '.main.humidity')
echo "Temperature: $temperature K"
echo "Description: $description"
echo "Humidity: $humidity%"
Testing JSON APIs with curl
When developing or testing APIs, you can create comprehensive test scripts:
#!/bin/bash
API_BASE="https://api.example.com"
TOKEN="your_auth_token"
# Test user creation
echo "Creating user..."
create_response=$(curl -s -X POST \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{"name": "Test User", "email": "test@example.com"}' \
"$API_BASE/users")
user_id=$(echo "$create_response" | jq -r '.id')
echo "Created user with ID: $user_id"
# Test user retrieval
echo "Fetching user..."
curl -s -H "Authorization: Bearer $TOKEN" \
"$API_BASE/users/$user_id" | jq '.'
Common Issues and Solutions
Handling Special Characters
When your JSON contains special characters, ensure proper escaping:
# Use single quotes to preserve special characters
curl -X POST \
-H "Content-Type: application/json" \
-d '{"message": "Hello \"World\"!"}' \
https://api.example.com/messages
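Rather than hand-escaping, you can let jq construct the payload: jq -n --arg quotes arbitrary strings correctly, which avoids most escaping mistakes (the message text here is arbitrary):

```shell
msg='He said "hello" & left a 100% tip'

# jq -n starts from no input; --arg binds a shell string as a jq variable
payload=$(jq -n --arg message "$msg" '{message: $message}')
echo "$payload"

# The resulting payload is then safe to send, e.g.:
# curl -H "Content-Type: application/json" -d "$payload" \
#   https://api.example.com/messages
```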
Large JSON Responses
For large JSON responses, consider using streaming and processing:
# Stream large responses and process incrementally
curl -s https://api.example.com/large-dataset | \
jq -c '.[]' | \
while read -r item; do
  # Process each item individually
  echo "$item" | jq '.importantField'
done
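You can rehearse the same loop without a large dataset: jq -c emits one compact object per line, which is exactly what makes the while read approach work (sample items invented):

```shell
# Two tiny items stand in for a large streamed response
printf '%s' '[{"importantField":"a"},{"importantField":"b"}]' |
  jq -c '.[]' |
  while read -r item; do
    echo "$item" | jq -r '.importantField'
  done
```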
Best Practices
- Always set appropriate headers: Use Accept: application/json for requests expecting JSON
- Handle errors gracefully: Check HTTP status codes and implement proper error handling
- Use jq for JSON processing: It's the most reliable tool for parsing and manipulating JSON
- Secure sensitive data: Never include API keys or passwords in command history
- Validate JSON: Use tools like jq to validate JSON structure before processing
When working with more complex web scraping scenarios that require JavaScript execution, you might want to consider using tools like Puppeteer for handling AJAX requests, especially when dealing with single-page applications that heavily rely on JSON APIs.
By mastering these curl techniques for JSON handling, you'll be well-equipped to work with modern web APIs and integrate them into your applications or scripts effectively.