Caching Strategies

Optimize API performance and reduce latency with Orbital7's caching solutions.

Caching Overview

How Orbital7 caching works and when to use it

What is Caching?

Caching stores frequently requested data so it can be served without repeating expensive backend work, reducing latency and server load. Orbital7 provides HTTP edge caching, database query caching, and custom rule-based caching strategies.

  • Reduces API response times by 40-70%
  • Supports both HTTP and WebSocket data
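
As a minimal illustration of the idea (independent of Orbital7), the Python sketch below caches the result of an expensive lookup and reuses it until a time-to-live expires; the function and key names are purely illustrative.

import time

_cache = {}  # key -> (value, expires_at)

def get_or_compute(key, compute, ttl=300):
    """Return a cached value while it is still fresh; otherwise recompute and store it."""
    now = time.time()
    entry = _cache.get(key)
    if entry and now < entry[1]:
        return entry[0]              # cache hit: skip the expensive call
    value = compute()                # cache miss: do the expensive work
    _cache[key] = (value, now + ttl)
    return value

# Illustrative usage: cache a slow backend lookup for five minutes.
profile = get_or_compute("user:42", lambda: {"id": 42, "name": "Ada"}, ttl=300)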

Why Use Caching?

Optimize critical API workflows for:

  • High-frequency read operations
  • Static or slowly changing data
  • Content negotiation support

Configuration Guide

Basic Configuration


# API-Specific Caching Configuration
endpoint:
  id: "/user/{id}"

  caching:
    # Enable cache
    enabled: true
    
    # Default TTL for cached responses
    default_ttl: 300 # seconds

    # Request attributes used to build the cache key
    cache_keys:
      - headers:
          accept-language: en-US
      - query_strings: ["user_id", "version"]

    # Vary cached responses by content negotiation headers (e.g. Accept)
    content_negotiation: true


Best Practices

Proven strategies for optimal performance and reliability

1. Cache Versioning

Use HTTP headers like Accept to keep different cache versions in separate entries, so clients expecting different representations never receive each other's stale responses.
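
For example, including the negotiated media type in the cache key keeps responses for different API versions apart; the vendor media types below are hypothetical.

cache = {}

def cache_key(path, accept):
    # The Accept header becomes part of the key, so versions never collide.
    return f"{path}|accept={accept}"

cache[cache_key("/user/42", "application/vnd.orbital7.v1+json")] = {"schema": "v1"}
cache[cache_key("/user/42", "application/vnd.orbital7.v2+json")] = {"schema": "v2"}

# Same path, different negotiated version -> different cache entry.
print(cache[cache_key("/user/42", "application/vnd.orbital7.v2+json")])  # {'schema': 'v2'}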

2. Cache Invalidation

Use X-Cache-Invalidate headers to explicitly manage cache expiration for dynamic content.
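
As a rough client-side sketch, the request below sends the X-Cache-Invalidate header mentioned above; the endpoint URL and the header's expected value are assumptions, so check how your gateway interprets them.

import urllib.request

# Hypothetical endpoint; "purge" as the header value is an assumption, not a documented contract.
req = urllib.request.Request(
    "https://api.example.com/user/42",
    headers={"X-Cache-Invalidate": "purge"},
)
with urllib.request.urlopen(req) as resp:
    # Gateways commonly report cache status back, e.g. in an X-Cache header.
    print(resp.status, resp.headers.get("X-Cache", "unknown"))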

3. Tiered Strategy

Combine in-memory cache for high-priority items with disk cache for low-frequency queries.
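
A minimal sketch of this idea, using only the Python standard library: hot entries live in an in-memory dict, everything is mirrored to a shelve-backed disk store, and disk hits are promoted back into memory. The TTLs and class name are illustrative.

import shelve
import time

class TieredCache:
    """Two-tier cache: hot entries in memory, colder entries on disk."""

    def __init__(self, disk_path="cache.db", memory_ttl=60, disk_ttl=3600):
        self.memory = {}                    # key -> (value, expires_at)
        self.disk = shelve.open(disk_path)  # persistent, slower tier
        self.memory_ttl = memory_ttl
        self.disk_ttl = disk_ttl

    def get(self, key):
        now = time.time()
        # 1. Check the fast in-memory tier first.
        if key in self.memory:
            value, expires_at = self.memory[key]
            if now < expires_at:
                return value
            del self.memory[key]
        # 2. Fall back to the disk tier and promote hits back into memory.
        if key in self.disk:
            value, expires_at = self.disk[key]
            if now < expires_at:
                self.memory[key] = (value, now + self.memory_ttl)
                return value
            del self.disk[key]
        return None

    def set(self, key, value):
        now = time.time()
        self.memory[key] = (value, now + self.memory_ttl)
        self.disk[key] = (value, now + self.disk_ttl)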

When to Use Caching

  • Use caching for read-heavy APIs with consistent response patterns, where it reduces load on backend services
  • Avoid caching for time-sensitive operations (banking, stock trading) that require data freshness and accuracy
  • Combine caching with rate limits and request-header rules for added security

Caching Layers

Layered caching architecture for maximum performance

Edge Caching

First-line cache at CDN level for global content distribution

Application Caching

Memory-based cache for frequently accessed API resources

Database Caching

Index-based caching of frequent database queries

Custom Cache Policies

Application-specific caching rules and strategies
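
As one possible shape for such rules, the sketch below maps path patterns to TTLs; the patterns, values, and helper name are hypothetical and not an Orbital7 policy schema.

import fnmatch

# Hypothetical rule set: first matching pattern wins; a TTL of 0 means "do not cache".
CACHE_RULES = [
    ("/static/*",   86400),  # long-lived assets
    ("/user/*",       300),  # matches the endpoint configured earlier
    ("/payments/*",     0),  # time-sensitive traffic is never cached
]

def ttl_for(path, default_ttl=60):
    """Return the TTL of the first matching rule, or a default."""
    for pattern, ttl in CACHE_RULES:
        if fnmatch.fnmatch(path, pattern):
            return ttl
    return default_ttl

print(ttl_for("/user/42"))       # 300
print(ttl_for("/payments/tx"))   # 0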

Performance Impact

Measurable improvements with proper caching

Up to 95% Faster

Response times improve in 85-95% of cases when caching is applied.