How do I handle timeouts in Go HTTP requests?

Handling timeouts properly in Go HTTP requests is crucial for building robust and reliable applications. Without proper timeout configurations, your application can hang indefinitely waiting for unresponsive servers, leading to poor user experience and resource exhaustion. This guide covers various timeout strategies and best practices for Go HTTP clients.

Understanding Different Types of Timeouts

Go's HTTP client supports several types of timeouts, each serving a specific purpose:

1. Context Timeout (Recommended)

The most flexible and recommended approach is to use a context with a timeout:

package main

import (
    "context"
    "fmt"
    "io"
    "net/http"
    "time"
)

func makeRequestWithContext() error {
    // Create a context with a 10-second timeout
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    // Create the HTTP request
    req, err := http.NewRequestWithContext(ctx, "GET", "https://api.example.com/data", nil)
    if err != nil {
        return fmt.Errorf("creating request: %w", err)
    }

    // Make the request
    client := &http.Client{}
    resp, err := client.Do(req)
    if err != nil {
        return fmt.Errorf("making request: %w", err)
    }
    defer resp.Body.Close()

    // Read the response
    body, err := io.ReadAll(resp.Body)
    if err != nil {
        return fmt.Errorf("reading response: %w", err)
    }

    fmt.Printf("Response: %s\n", string(body))
    return nil
}
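
A key reason the context approach is recommended is that deadlines propagate. The sketch below (the handler name and upstream URL are placeholders) derives its outbound timeout from the incoming request's context, so a client disconnect or an upstream deadline cancels the outbound call automatically:

func proxyHandler(w http.ResponseWriter, r *http.Request) {
    // Inherit cancelation from the incoming request, then cap it at 5 seconds.
    ctx, cancel := context.WithTimeout(r.Context(), 5*time.Second)
    defer cancel()

    req, err := http.NewRequestWithContext(ctx, "GET", "https://api.example.com/data", nil)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        http.Error(w, err.Error(), http.StatusBadGateway)
        return
    }
    defer resp.Body.Close()

    io.Copy(w, resp.Body) // stream the upstream body back to the caller
}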

2. Client-Level Timeout

You can set a global timeout for all requests made by a specific client:

func createClientWithTimeout() *http.Client {
    return &http.Client{
        Timeout: 30 * time.Second,
    }
}

func makeSimpleRequest() error {
    client := createClientWithTimeout()

    resp, err := client.Get("https://api.example.com/data")
    if err != nil {
        return fmt.Errorf("request failed: %w", err)
    }
    defer resp.Body.Close()

    // Process response...
    return nil
}
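
Note that Client.Timeout bounds the entire exchange: dialing, the TLS handshake, writing the request, reading headers, and reading the body. Here is a minimal sketch demonstrating that (it assumes an extra import of net/http/httptest): the headers arrive quickly, but the stalled body still trips the one-second timeout:

func demoBodyTimeout() {
    server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        w.WriteHeader(http.StatusOK)
        if f, ok := w.(http.Flusher); ok {
            f.Flush() // headers go out immediately
        }
        time.Sleep(3 * time.Second) // then the body stalls
        w.Write([]byte("too late"))
    }))
    defer server.Close()

    client := &http.Client{Timeout: 1 * time.Second}
    resp, err := client.Get(server.URL)
    if err != nil {
        fmt.Println("request error:", err)
        return
    }
    defer resp.Body.Close()

    // Headers arrived in time, but reading the body exceeds the deadline.
    _, err = io.ReadAll(resp.Body)
    fmt.Println("body read error:", err) // reports the client timeout
}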

3. Transport-Level Timeouts

For fine-grained control, configure timeouts at the transport level:

import "net"

func createCustomTransport() *http.Client {
    transport := &http.Transport{
        // Time to establish a TCP connection
        DialContext: (&net.Dialer{
            Timeout: 5 * time.Second,
        }).DialContext,

        // Time to perform TLS handshake
        TLSHandshakeTimeout: 10 * time.Second,

        // Time to wait for server's first response headers
        ResponseHeaderTimeout: 10 * time.Second,

        // Time to wait for a "100 Continue" response after sending request
        // headers with "Expect: 100-continue" (ignored for other requests)
        ExpectContinueTimeout: 1 * time.Second,

        // How long an idle (keep-alive) connection may sit in the pool
        IdleConnTimeout: 90 * time.Second,
    }

    return &http.Client{
        Transport: transport,
        Timeout:   30 * time.Second, // Overall request timeout
    }
}

Advanced Timeout Patterns

Retry Logic with Exponential Backoff

Combine timeouts with retry logic for better resilience:

import (
    "errors"
    "math"
    "math/rand"
    "sync"
    "time"
)

type RetryConfig struct {
    MaxRetries int
    BaseDelay  time.Duration
    MaxDelay   time.Duration
    Timeout    time.Duration
}

func makeRequestWithRetry(url string, config RetryConfig) (*http.Response, error) {
    // The client's Timeout already bounds each attempt. Wrapping each attempt
    // in a context and deferring cancel() here would be a bug: the defers
    // pile up across iterations, and the successful response's context would
    // be canceled before the caller could read the body.
    client := &http.Client{Timeout: config.Timeout}

    for attempt := 0; attempt <= config.MaxRetries; attempt++ {
        req, err := http.NewRequest("GET", url, nil)
        if err != nil {
            return nil, fmt.Errorf("creating request: %w", err)
        }

        resp, err := client.Do(req)
        if err != nil {
            if attempt == config.MaxRetries {
                return nil, fmt.Errorf("max retries exceeded: %w", err)
            }

            // Calculate exponential backoff delay
            delay := time.Duration(math.Pow(2, float64(attempt))) * config.BaseDelay
            if delay > config.MaxDelay {
                delay = config.MaxDelay
            }

            // Add jitter to prevent thundering herd
            jitter := time.Duration(rand.Intn(1000)) * time.Millisecond
            time.Sleep(delay + jitter)
            continue
        }

        return resp, nil
    }

    return nil, errors.New("unexpected retry loop exit")
}

// Usage example
func exampleRetryUsage() {
    config := RetryConfig{
        MaxRetries: 3,
        BaseDelay:  1 * time.Second,
        MaxDelay:   10 * time.Second,
        Timeout:    15 * time.Second,
    }

    resp, err := makeRequestWithRetry("https://api.example.com/data", config)
    if err != nil {
        fmt.Printf("Request failed: %v\n", err)
        return
    }
    defer resp.Body.Close()

    // Process successful response...
}

Concurrent Requests with Timeout

When making multiple requests concurrently, proper timeout handling becomes even more important:

type Result struct {
    URL      string
    Response *http.Response
    Error    error
}

func makeConcurrentRequests(urls []string, timeout time.Duration) []Result {
    results := make([]Result, len(urls))
    var wg sync.WaitGroup

    client := &http.Client{Timeout: timeout}

    for i, url := range urls {
        wg.Add(1)
        go func(index int, requestURL string) {
            defer wg.Done()

            // Rely on the client's Timeout for each request. A per-goroutine
            // context canceled when the goroutine exits would leave the
            // stored Response bodies unreadable for the caller.
            req, err := http.NewRequest("GET", requestURL, nil)
            if err != nil {
                results[index] = Result{URL: requestURL, Error: err}
                return
            }

            resp, err := client.Do(req)
            results[index] = Result{URL: requestURL, Response: resp, Error: err}
        }(i, url)
    }

    wg.Wait()
    return results
}
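
If you want fail-fast semantics instead, canceling all in-flight requests as soon as one fails or the shared deadline expires, the errgroup package (golang.org/x/sync/errgroup) pairs naturally with a single context. A minimal sketch; the function name is illustrative, and each body is read inside its own goroutine so the shared context can be canceled safely afterwards:

import "golang.org/x/sync/errgroup"

func fetchAllOrFail(urls []string, timeout time.Duration) ([][]byte, error) {
    ctx, cancel := context.WithTimeout(context.Background(), timeout)
    defer cancel()

    g, ctx := errgroup.WithContext(ctx)
    bodies := make([][]byte, len(urls))

    for i, u := range urls {
        i, u := i, u // capture loop variables (needed before Go 1.22)
        g.Go(func() error {
            req, err := http.NewRequestWithContext(ctx, "GET", u, nil)
            if err != nil {
                return err
            }
            resp, err := http.DefaultClient.Do(req)
            if err != nil {
                return err
            }
            defer resp.Body.Close()
            bodies[i], err = io.ReadAll(resp.Body)
            return err
        })
    }

    // Wait returns the first non-nil error; that error also cancels the
    // group's context, aborting the remaining in-flight requests.
    if err := g.Wait(); err != nil {
        return nil, err
    }
    return bodies, nil
}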

Web Scraping Considerations

When building web scrapers, timeout handling is particularly important because server response times vary widely between sites. Much like handling timeouts in Puppeteer, Go scrapers need layered timeout strategies:

type WebScraper struct {
    client *http.Client
    config ScrapingConfig
}

type ScrapingConfig struct {
    RequestTimeout    time.Duration
    ConnectionTimeout time.Duration
    RetryAttempts     int
    RateLimitDelay    time.Duration
}

func NewWebScraper(config ScrapingConfig) *WebScraper {
    transport := &http.Transport{
        DialContext: (&net.Dialer{
            Timeout: config.ConnectionTimeout,
        }).DialContext,
        TLSHandshakeTimeout:   10 * time.Second,
        ResponseHeaderTimeout: 15 * time.Second,
    }

    client := &http.Client{
        Transport: transport,
        Timeout:   config.RequestTimeout,
    }

    return &WebScraper{
        client: client,
        config: config,
    }
}

func (ws *WebScraper) ScrapeURL(url string) ([]byte, error) {
    for attempt := 0; attempt < ws.config.RetryAttempts; attempt++ {
        // Create and release the per-attempt context explicitly: deferring
        // cancel() or resp.Body.Close() inside a loop keeps every attempt's
        // resources alive until the function returns.
        ctx, cancel := context.WithTimeout(context.Background(), ws.config.RequestTimeout)

        req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
        if err != nil {
            cancel()
            return nil, fmt.Errorf("creating request: %w", err)
        }

        // Identify the client; many servers throttle the default Go user agent
        req.Header.Set("User-Agent", "Mozilla/5.0 (compatible; GoScraper/1.0)")

        resp, err := ws.client.Do(req)
        if err != nil {
            cancel()
            if attempt < ws.config.RetryAttempts-1 {
                time.Sleep(ws.config.RateLimitDelay)
                continue
            }
            return nil, fmt.Errorf("request failed after %d attempts: %w", ws.config.RetryAttempts, err)
        }

        if resp.StatusCode != http.StatusOK {
            resp.Body.Close()
            cancel()
            if attempt < ws.config.RetryAttempts-1 {
                time.Sleep(ws.config.RateLimitDelay)
                continue
            }
            return nil, fmt.Errorf("unexpected status: %s", resp.Status)
        }

        body, err := io.ReadAll(resp.Body)
        resp.Body.Close()
        cancel()
        if err != nil {
            return nil, fmt.Errorf("reading response body: %w", err)
        }

        return body, nil
    }

    return nil, errors.New("unexpected retry loop exit")
}

Error Handling and Timeout Detection

Properly detecting and handling timeout errors is essential:

import (
    "net"
    "os"
    "syscall"
)

func handleHTTPError(err error) {
    if err == nil {
        return
    }

    // Check for context timeout
    if errors.Is(err, context.DeadlineExceeded) {
        fmt.Println("Request timed out (context deadline exceeded)")
        return
    }

    // Check for client timeout
    if os.IsTimeout(err) {
        fmt.Println("Request timed out (client timeout)")
        return
    }

    // Check for network timeouts
    var netErr net.Error
    if errors.As(err, &netErr) && netErr.Timeout() {
        fmt.Println("Network timeout occurred")
        return
    }

    // Check for connection refused
    var syscallErr *os.SyscallError
    if errors.As(err, &syscallErr) && errors.Is(syscallErr.Err, syscall.ECONNREFUSED) {
        fmt.Println("Connection refused by server")
        return
    }

    fmt.Printf("Other error: %v\n", err)
}

func exampleErrorHandling() {
    ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    defer cancel()

    req, _ := http.NewRequestWithContext(ctx, "GET", "https://httpbin.org/delay/10", nil) // error ignored for brevity

    client := &http.Client{Timeout: 3 * time.Second}
    _, err := client.Do(req)

    handleHTTPError(err) // Will detect and report the timeout
}

Testing Timeout Scenarios

Testing your timeout handling is crucial for ensuring reliability:

# Test with a slow endpoint
curl -w "Time: %{time_total}s\n" "https://httpbin.org/delay/5"

# Test connection timeout
curl --connect-timeout 1 "https://example.com:12345"

# Test read timeout
curl --max-time 2 "https://httpbin.org/delay/5"

Example Go test for timeout scenarios:

func TestRequestTimeout(t *testing.T) {
    server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        time.Sleep(2 * time.Second) // Simulate slow response
        w.WriteHeader(http.StatusOK)
    }))
    defer server.Close()

    client := &http.Client{Timeout: 1 * time.Second}

    _, err := client.Get(server.URL)
    if err == nil {
        t.Error("Expected timeout error, got nil")
    }

    if !os.IsTimeout(err) {
        t.Errorf("Expected timeout error, got: %v", err)
    }
}

Production Best Practices

1. Configuration Management

type HTTPConfig struct {
    Timeout           time.Duration `json:"timeout" env:"HTTP_TIMEOUT" default:"30s"`
    ConnectionTimeout time.Duration `json:"connection_timeout" env:"HTTP_CONNECTION_TIMEOUT" default:"10s"`
    MaxIdleConns      int          `json:"max_idle_conns" env:"HTTP_MAX_IDLE_CONNS" default:"100"`
    MaxConnsPerHost   int          `json:"max_conns_per_host" env:"HTTP_MAX_CONNS_PER_HOST" default:"10"`
}

func (c *HTTPConfig) CreateClient() *http.Client {
    transport := &http.Transport{
        DialContext: (&net.Dialer{
            Timeout: c.ConnectionTimeout,
        }).DialContext,
        MaxIdleConns:        c.MaxIdleConns,
        MaxConnsPerHost:     c.MaxConnsPerHost,
        IdleConnTimeout:     90 * time.Second,
        TLSHandshakeTimeout: 10 * time.Second,
    }

    return &http.Client{
        Transport: transport,
        Timeout:   c.Timeout,
    }
}

2. Monitoring and Metrics

type RequestMetrics struct {
    TotalRequests    int64
    TimeoutCount     int64
    AverageLatency   time.Duration
    MaxLatency       time.Duration
}

func (ws *WebScraper) ScrapeWithMetrics(url string, metrics *RequestMetrics) ([]byte, error) {
    start := time.Now()
    defer func() {
        latency := time.Since(start)
        metrics.TotalRequests++

        if latency > metrics.MaxLatency {
            metrics.MaxLatency = latency
        }

        // Update rolling average (simplified)
        metrics.AverageLatency = (metrics.AverageLatency + latency) / 2
    }()

    data, err := ws.ScrapeURL(url)
    if err != nil && (os.IsTimeout(err) || errors.Is(err, context.DeadlineExceeded)) {
        metrics.TimeoutCount++
    }

    return data, err
}
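
If a scraper calls ScrapeWithMetrics from multiple goroutines, the plain counter updates above are data races. A hedged variant using sync/atomic (Go 1.19+; the type and method names are illustrative) keeps the counters safe without a mutex:

import "sync/atomic"

type SafeMetrics struct {
    TotalRequests atomic.Int64
    TimeoutCount  atomic.Int64
    MaxLatencyNS  atomic.Int64 // nanoseconds, so it fits an atomic integer
}

func (m *SafeMetrics) Record(latency time.Duration, timedOut bool) {
    m.TotalRequests.Add(1)
    if timedOut {
        m.TimeoutCount.Add(1)
    }
    // Compare-and-swap loop: only replace the stored maximum if ours is larger.
    for {
        cur := m.MaxLatencyNS.Load()
        if int64(latency) <= cur || m.MaxLatencyNS.CompareAndSwap(cur, int64(latency)) {
            break
        }
    }
}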

Common Timeout Values

Here are recommended timeout values for different scenarios (one way to encode them as defaults is sketched after the list):

  • API calls: 10-30 seconds
  • File downloads: 60-300 seconds
  • Database queries: 5-30 seconds
  • Microservice communication: 5-15 seconds
  • Web scraping: 15-60 seconds
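
One way to turn these recommendations into code is a set of named defaults; the constant names and the helper below are purely illustrative:

const (
    APICallTimeout      = 30 * time.Second
    FileDownloadTimeout = 5 * time.Minute
    DatabaseTimeout     = 30 * time.Second
    MicroserviceTimeout = 15 * time.Second
    ScrapingTimeout     = 60 * time.Second
)

// clientFor returns a client bounded by the given overall timeout.
func clientFor(timeout time.Duration) *http.Client {
    return &http.Client{Timeout: timeout}
}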

When handling HTTPS certificates in Go scraping, remember that the TLS handshake timeout should be configured separately from the overall request timeout.
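
For example, a client that trusts a custom CA bundle still gets its own handshake budget. A minimal sketch, assuming imports of crypto/tls and crypto/x509; the function name and PEM input are illustrative:

func clientWithCustomCA(caPEM []byte) (*http.Client, error) {
    pool := x509.NewCertPool()
    if !pool.AppendCertsFromPEM(caPEM) {
        return nil, errors.New("no certificates parsed from PEM")
    }

    transport := &http.Transport{
        TLSClientConfig:     &tls.Config{RootCAs: pool},
        TLSHandshakeTimeout: 10 * time.Second, // handshake budget only
    }

    return &http.Client{
        Transport: transport,
        Timeout:   30 * time.Second, // whole-request budget
    }, nil
}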

Conclusion

Proper timeout handling in Go HTTP requests is essential for building reliable applications. Use context-based timeouts for maximum flexibility, implement retry logic with exponential backoff for resilience, and always handle timeout errors gracefully. When building web scrapers or applications that make frequent HTTP requests, consider implementing comprehensive monitoring and configuration management to ensure optimal performance and reliability.

Remember that timeout values should be tuned based on your specific use case, network conditions, and server response characteristics. Start with conservative values and adjust based on monitoring data and performance requirements.

Try WebScraping.AI for Your Web Scraping Needs

Looking for a powerful web scraping solution? WebScraping.AI provides an LLM-powered API that combines Chromium JavaScript rendering with rotating proxies for reliable data extraction.

Key Features:

  • AI-powered extraction: Ask questions about web pages or extract structured data fields
  • JavaScript rendering: Full Chromium browser support for dynamic content
  • Rotating proxies: Datacenter and residential proxies from multiple countries
  • Easy integration: Simple REST API with SDKs for Python, Ruby, PHP, and more
  • Reliable & scalable: Built for developers who need consistent results

Getting Started:

Get page content with AI analysis:

curl "https://api.webscraping.ai/ai/question?url=https://example.com&question=What is the main topic?&api_key=YOUR_API_KEY"

Extract structured data:

curl "https://api.webscraping.ai/ai/fields?url=https://example.com&fields[title]=Page title&fields[price]=Product price&api_key=YOUR_API_KEY"
