How Do I Set Up Proxy Support in n8n Automation?
Setting up proxy support in n8n automation workflows is essential for web scraping projects that require IP rotation, geographic location targeting, or bypassing rate limits. n8n provides multiple approaches to configure proxies depending on the nodes and tools you're using.
Understanding Proxy Types
Before configuring proxies in n8n, it's important to understand the different proxy types:
- HTTP/HTTPS Proxies: Standard proxies that work with web traffic
- SOCKS Proxies: More versatile proxies that can handle various protocols
- Rotating Proxies: Services that automatically rotate IP addresses for each request
- Residential Proxies: IP addresses from real residential devices, harder to detect
- Datacenter Proxies: Faster but more easily detected by anti-bot systems
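Whichever type you choose, it mainly changes the URL scheme you pass to the nodes configured later in this guide. As a quick illustration (hosts, ports, and credentials below are placeholders), a proxy URL can be assembled like this:

```javascript
// Build a proxy URL from its parts; the scheme encodes the proxy type
// (http/https vs. socks5). All values here are placeholder examples.
function buildProxyUrl({ scheme, username, password, host, port }) {
  // Credentials must be URL-encoded in case they contain special characters
  const auth = username
    ? `${encodeURIComponent(username)}:${encodeURIComponent(password)}@`
    : '';
  return `${scheme}://${auth}${host}:${port}`;
}

// HTTP proxy with credentials
console.log(buildProxyUrl({ scheme: 'http', username: 'user', password: 'p@ss', host: 'proxy.example.com', port: 8080 }));
// → http://user:p%40ss@proxy.example.com:8080

// SOCKS5 proxy without credentials
console.log(buildProxyUrl({ scheme: 'socks5', host: 'proxy.example.com', port: 1080 }));
// → socks5://proxy.example.com:1080
```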
Method 1: Using HTTP Request Node with Proxy
The simplest way to use proxies in n8n is through the HTTP Request node, which has built-in proxy support.
Configuration Steps
- Add an HTTP Request node to your workflow
- Configure the URL and request method
- Expand the node's Options section and add the Proxy option
- Enter your proxy URL in the format:
http://username:password@proxy-host:port
Example Workflow
{
  "nodes": [
    {
      "parameters": {
        "url": "https://example.com",
        "options": {
          "proxy": "http://user:pass@proxy.example.com:8080"
        }
      },
      "name": "HTTP Request",
      "type": "n8n-nodes-base.httpRequest"
    }
  ]
}
Environment Variables for Proxies
For security and flexibility, store proxy credentials as environment variables:
# In your n8n environment
export HTTP_PROXY=http://username:password@proxy.example.com:8080
export HTTPS_PROXY=http://username:password@proxy.example.com:8080
Then reference them in your workflow:
// In a Code node (note: access to process.env can be blocked via the
// N8N_BLOCK_ENV_ACCESS_IN_NODE environment variable)
const proxy = process.env.HTTP_PROXY;
return { proxy };
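If a node needs the proxy split into host, port, and credentials (the shape axios's `proxy` option expects), the URL can be parsed with Node's built-in URL class. A minimal sketch, assuming HTTP_PROXY follows the format above (the fallback URL is a placeholder):

```javascript
// Parse a proxy URL (e.g. from HTTP_PROXY) into the { host, port, auth }
// shape used by axios's `proxy` option.
function parseProxyUrl(proxyUrl) {
  const parsed = new URL(proxyUrl);
  const config = {
    host: parsed.hostname,
    port: Number(parsed.port)
  };
  if (parsed.username) {
    config.auth = {
      // URL getters return percent-encoded values, so decode them
      username: decodeURIComponent(parsed.username),
      password: decodeURIComponent(parsed.password)
    };
  }
  return config;
}

// Placeholder fallback for illustration
const proxyConfig = parseProxyUrl(process.env.HTTP_PROXY || 'http://user:pass@proxy.example.com:8080');
```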
Method 2: Puppeteer Node with Proxy Support
When using the community Puppeteer node (n8n-nodes-puppeteer) for browser automation, you can configure proxies through Chromium launch arguments. This is particularly useful when handling browser sessions in Puppeteer for scraping tasks.
Puppeteer Proxy Configuration
// In n8n Puppeteer node
{
  "headless": true,
  "args": [
    "--proxy-server=http://proxy.example.com:8080",
    "--disable-web-security"
  ]
}
Complete Puppeteer Example with Authentication
// In a Code node (puppeteer must be installed on the n8n instance and
// allowed via the NODE_FUNCTION_ALLOW_EXTERNAL environment variable)
const puppeteer = require('puppeteer');
const browser = await puppeteer.launch({
  headless: true,
  args: [
    '--proxy-server=http://proxy.example.com:8080'
  ]
});
const page = await browser.newPage();
// Authenticate with the proxy
await page.authenticate({
  username: 'your_username',
  password: 'your_password'
});
// Navigate to the target page
await page.goto('https://example.com');
const content = await page.content();
await browser.close();
return { content };
Method 3: Using Code Node with Custom Proxy Libraries
For advanced proxy management, use a Code node with libraries like axios or got that support proxies.
Node.js Example with Axios
const axios = require('axios');
// https-proxy-agent v7+ uses a named export; older versions export the
// class directly from the module
const { HttpsProxyAgent } = require('https-proxy-agent');
// Configure the proxy agent (credentials are placeholders)
const proxyUrl = 'http://username:password@proxy.example.com:8080';
const agent = new HttpsProxyAgent(proxyUrl);
// Make the request through the proxy
const response = await axios.get('https://example.com', {
  httpsAgent: agent,
  headers: {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
  }
});
return { data: response.data };
Python Code Node Example
# Note: n8n's built-in Python Code node runs in a Pyodide sandbox where
# the requests library may not work out of the box; this example assumes
# an environment where requests is available.
import requests
# Configure the proxy (credentials are placeholders)
proxies = {
    'http': 'http://username:password@proxy.example.com:8080',
    'https': 'http://username:password@proxy.example.com:8080'
}
# Make the request through the proxy
response = requests.get(
    'https://example.com',
    proxies=proxies,
    headers={
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
    }
)
return {'data': response.text}
Method 4: Rotating Proxy Services
When working with rotating proxies in n8n workflows, you can integrate services such as Bright Data, Oxylabs, or Smartproxy.
Rotating Proxy Example
// Code node for a rotating proxy
const axios = require('axios');
// Rotating proxy endpoint: http://customer-USER:PASS@pr.oxylabs.io:7777
// (credentials are placeholders; the same endpoint is expressed below
// through axios's proxy option)
const makeRequest = async (url) => {
  const response = await axios.get(url, {
    proxy: {
      host: 'pr.oxylabs.io',
      port: 7777,
      auth: {
        username: 'customer-USER',
        password: 'PASS'
      }
    }
  });
  return response.data;
};
// Each request automatically uses a different IP
const data = await makeRequest('https://example.com');
return { data };
Method 5: WebScraping.AI API with Built-in Proxy Support
For a simplified approach, use WebScraping.AI API which handles proxy rotation and management automatically:
JavaScript Example
// Code node with WebScraping.AI
const axios = require('axios');
const response = await axios.get('https://api.webscraping.ai/html', {
  params: {
    api_key: 'YOUR_API_KEY',
    url: 'https://example.com',
    proxy: 'datacenter', // or 'residential'
    country: 'us'
  }
});
return { html: response.data };
Python Example
import requests
response = requests.get(
    'https://api.webscraping.ai/html',
    params={
        'api_key': 'YOUR_API_KEY',
        'url': 'https://example.com',
        'proxy': 'datacenter',
        'country': 'us'
    }
)
return {'html': response.text}
Best Practices for Proxy Usage in n8n
1. Implement Retry Logic
// Code node with retry logic and backoff
// (makeProxyRequest and targetUrl are placeholders for your own request
// helper and target URL)
const maxRetries = 3;
let attempt = 0;
while (attempt < maxRetries) {
  try {
    const response = await makeProxyRequest(targetUrl);
    return { success: true, data: response };
  } catch (error) {
    attempt++;
    if (attempt >= maxRetries) {
      throw error;
    }
    // Wait a little longer after each failed attempt
    await new Promise(resolve => setTimeout(resolve, 1000 * attempt));
  }
}
2. Proxy Pool Management
// Manage multiple proxies
const proxyPool = [
  'http://user:pass@proxy1.example.com:8080',
  'http://user:pass@proxy2.example.com:8080',
  'http://user:pass@proxy3.example.com:8080'
];
// Rotate through proxies by picking one at random
const getRandomProxy = () => {
  return proxyPool[Math.floor(Math.random() * proxyPool.length)];
};
const proxy = getRandomProxy();
3. Proxy Health Checking
// Test a proxy before using it
// (parseProxyUrl stands for a helper that converts a proxy URL string
// into axios's { host, port, auth } proxy config)
const testProxy = async (proxyUrl) => {
  try {
    const response = await axios.get('https://api.ipify.org?format=json', {
      proxy: parseProxyUrl(proxyUrl),
      timeout: 5000
    });
    return { working: true, ip: response.data.ip };
  } catch (error) {
    return { working: false, error: error.message };
  }
};
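A health check like this can be combined with the pool from the previous section to keep only responsive proxies before a scraping run. A minimal sketch (the checker is stubbed here so the example runs without live proxies; in a workflow you would pass the testProxy function from above):

```javascript
// Run a health check across a pool of proxy URLs in parallel and
// keep only the ones the checker reports as working.
async function filterWorkingProxies(pool, checkProxy) {
  const results = await Promise.all(pool.map(checkProxy));
  return pool.filter((_, i) => results[i].working);
}

// Stubbed checker for illustration: pretends only proxy1 is reachable.
const stubCheck = async (proxyUrl) => ({ working: proxyUrl.includes('proxy1') });

const pool = [
  'http://user:pass@proxy1.example.com:8080',
  'http://user:pass@proxy2.example.com:8080'
];
filterWorkingProxies(pool, stubCheck).then(working => console.log(working));
```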
Troubleshooting Common Proxy Issues
Connection Timeouts
When handling timeouts in Puppeteer or HTTP requests, increase timeout values:
const response = await axios.get(url, {
  proxy: proxyConfig,
  timeout: 30000 // 30 seconds
});
Authentication Errors
Ensure credentials are properly URL-encoded:
const username = encodeURIComponent('user@example.com');
const password = encodeURIComponent('p@ssw0rd!');
const proxyUrl = `http://${username}:${password}@proxy.example.com:8080`;
SSL Certificate Issues
For HTTPS proxies, certificate errors may require disabling SSL verification. Doing so removes protection against man-in-the-middle attacks, so restrict it to debugging:
const agent = new HttpsProxyAgent(proxyUrl);
// Whether this flag is honored depends on the https-proxy-agent version
agent.options.rejectUnauthorized = false;
Advanced Configuration: SOCKS Proxy
For SOCKS proxy support, use the socks-proxy-agent package:
// socks-proxy-agent v6+ uses a named export; older versions export the
// class directly from the module
const { SocksProxyAgent } = require('socks-proxy-agent');
const socksProxy = 'socks5://username:password@proxy.example.com:1080';
const agent = new SocksProxyAgent(socksProxy);
const response = await axios.get('https://example.com', {
  httpsAgent: agent
});
Testing Your Proxy Configuration
Create a simple test workflow to verify proxy functionality:
// Test node to verify proxy IP
const axios = require('axios');
const response = await axios.get('https://api.ipify.org?format=json', {
  proxy: {
    host: 'proxy.example.com',
    port: 8080,
    auth: {
      username: 'user',
      password: 'pass'
    }
  }
});
return {
  proxyIP: response.data.ip,
  message: 'Proxy is working correctly'
};
Conclusion
Setting up proxy support in n8n automation workflows is straightforward once you understand the different methods available. Whether you're using the built-in HTTP Request node, configuring browser automation with Puppeteer for web scraping, or implementing custom solutions with Code nodes, n8n provides the flexibility to integrate proxies effectively.
For production workflows, consider using managed proxy services or APIs like WebScraping.AI that handle proxy rotation, IP management, and anti-bot bypassing automatically, allowing you to focus on extracting the data you need rather than managing proxy infrastructure.
Remember to always respect website terms of service and robots.txt files when implementing web scraping workflows, and use proxies ethically and legally.