Caching Strategies

Cache-aside, read-through, write-through, and write-back patterns.


Caching is one of the most effective performance optimizations available: a cache hit is served from fast memory and is nearly free, while a cache miss costs a full round trip to the backing data store.

Cache-Aside (Lazy Loading)

The most common pattern. The application is responsible for loading data into the cache.

async function getUser(id) {
  // 1. Check the cache first
  let user = await cache.get(`user:${id}`);

  if (user) {
    return user; // Cache hit
  }

  // 2. Cache miss - load from the database
  user = await db.users.find({ id });

  // 3. Populate the cache for the next request (skip if no such user exists)
  if (user) {
    await cache.set(`user:${id}`, user, { ttl: 3600 });
  }

  return user;
}

Pros: Simple; data is only loaded when needed.
Cons: Cache-miss penalty; multiple simultaneous misses can thrash the database.


Read-Through

The cache layer handles loading data from the database automatically.

// Cache layer automatically calls this on miss
const cache = new ReadThroughCache({
  loader: async (key) => {
    return await db.users.find({ id: key.split(':')[1] });
  },
  ttl: 3600
});

// Application code becomes simpler
async function getUser(id) {
  return await cache.get(`user:${id}`); // Cache handles miss automatically
}

Pros: Simpler application code; consistent cache behavior.
Cons: Less flexibility; the cache becomes tightly coupled to the data source.
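
ReadThroughCache above stands in for whatever caching layer you use rather than a specific library. A minimal in-memory sketch of such a wrapper might look like this (the Map-backed store and second-based TTL are assumptions for illustration):

// Minimal in-memory read-through wrapper: the loader is the only path for misses
class ReadThroughCache {
  constructor({ loader, ttl }) {
    this.loader = loader;      // async (key) => value, invoked on every miss
    this.ttlMs = ttl * 1000;   // ttl is given in seconds
    this.store = new Map();    // key -> { value, expiresAt }
  }

  async get(key) {
    const entry = this.store.get(key);
    if (entry && entry.expiresAt > Date.now()) {
      return entry.value; // Cache hit
    }

    // Cache miss (or expired entry): delegate to the loader, then repopulate
    const value = await this.loader(key);
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}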


Write-Through

Data is written to both cache and database simultaneously.

async function updateUser(id, data) {
  // Write to the database first
  const updated = await db.users.update({ id }, data);

  // Then update the cache so reads see the new value immediately
  await cache.set(`user:${id}`, updated);

  return updated;
}

Pros: Cache and database are always synchronized; the cache never contains stale data.
Cons: Higher write latency, since every update touches both stores.


Write-Behind (Write-Back)

Write to cache immediately, asynchronously persist to database.

async function updateUser(id, data) {
  // Update cache immediately (fast response)
  await cache.set(`user:${id}`, data);
  
  // Queue for async database write
  await queue.add('update-user', { id, data });
  
  return data; // Return before database write
}

Pros: Extremely fast writes; database operations can be batched.
Cons: Risk of data loss if the cache fails; complex recovery logic.
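
The other half of the pattern is a background consumer that drains the queue and persists the writes, ideally in batches. A rough sketch, where queue.take and db.users.bulkUpdate are hypothetical APIs standing in for your queue and database clients:

// Background worker: drain queued user updates and flush them in one batch
async function flushUserWrites() {
  // Hypothetical API: pull up to 100 pending 'update-user' jobs off the queue
  const jobs = await queue.take('update-user', 100);
  if (jobs.length === 0) return;

  // One bulk write instead of 100 individual updates
  await db.users.bulkUpdate(jobs.map(({ id, data }) => ({ id, data })));
}

// Flush periodically; anything still queued when the worker or cache dies
// is the data-loss risk noted above
setInterval(() => flushUserWrites().catch(console.error), 1000);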


Cache Invalidation Strategies

TTL (Time-To-Live)

The simplest approach: data expires after a fixed time.

await cache.set('user:123', data, { ttl: 3600 }); // Expires in 1 hour

Event-Driven Invalidation

Invalidate cache when data changes.

// When user updates
await cache.delete(`user:${id}`);
await cache.delete(`user:${id}:posts`); // Related data
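
These deletes typically run right after the database write succeeds, so the next read misses and repopulates the cache with fresh data via the cache-aside path. A sketch, reusing the hypothetical cache and db clients from the earlier examples:

async function updateUser(id, data) {
  // Persist the change first
  const updated = await db.users.update({ id }, data);

  // Then invalidate the affected entries; the next read rebuilds them
  await cache.delete(`user:${id}`);
  await cache.delete(`user:${id}:posts`);

  return updated;
}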

Tag-Based Invalidation

Group related cache entries.

await cache.set('user:123', data, { tags: ['user', 'premium'] });

// Invalidate all premium user caches
await cache.invalidateByTag('premium');
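
invalidateByTag is not a standard cache API, so support varies by store. A minimal in-memory sketch of the idea, keeping a reverse index from tags to keys (TTL handling omitted, names illustrative):

// Tag-aware cache: byTag maps each tag to the set of keys carrying it
class TaggedCache {
  constructor() {
    this.values = new Map(); // key -> value
    this.byTag = new Map();  // tag -> Set of keys
  }

  set(key, value, { tags = [] } = {}) {
    this.values.set(key, value);
    for (const tag of tags) {
      if (!this.byTag.has(tag)) this.byTag.set(tag, new Set());
      this.byTag.get(tag).add(key);
    }
  }

  get(key) {
    return this.values.get(key);
  }

  invalidateByTag(tag) {
    // Drop every key registered under this tag, then forget the tag itself
    for (const key of this.byTag.get(tag) ?? []) {
      this.values.delete(key);
    }
    this.byTag.delete(tag);
  }
}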

Cache Stampede Prevention

When a hot cache key expires, multiple requests can simultaneously try to rebuild it, overwhelming your database.

Solutions:

  1. Locking: the first request to miss acquires a lock and rebuilds the entry; the others wait for its result (see the sketch after this list)
  2. Probabilistic Early Expiration: refresh entries at a random point before the TTL actually elapses, so rebuilds are spread out
  3. Stale-While-Revalidate: serve the stale value while a single refresh runs in the background
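
A sketch of the locking approach within a single process: concurrent requests for the same key share one in-flight rebuild instead of each hitting the database. It reuses the hypothetical cache and db clients from above; a multi-process deployment would need a distributed lock instead of a local Map:

const inFlight = new Map(); // key -> Promise for a rebuild already in progress

async function getUserCoalesced(id) {
  const key = `user:${id}`;

  const cached = await cache.get(key);
  if (cached) return cached;

  // Another request is already rebuilding this key: wait for its result
  if (inFlight.has(key)) return inFlight.get(key);

  // Otherwise this request "takes the lock" and performs the rebuild
  const rebuild = (async () => {
    try {
      const user = await db.users.find({ id });
      await cache.set(key, user, { ttl: 3600 });
      return user;
    } finally {
      inFlight.delete(key); // Release so future expiries can rebuild again
    }
  })();

  inFlight.set(key, rebuild);
  return rebuild;
}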

Choosing the Right Strategy

Pattern         Best For                  Complexity  Consistency
Cache-Aside     Read-heavy workloads      Low         Eventual
Read-Through    Simplifying code          Medium      Eventual
Write-Through   Critical data integrity   High        Strong
Write-Behind    Write-heavy workloads     Very High   Weak

Rule of Thumb

Start with Cache-Aside for reads and Write-Through for writes. It's the most predictable combination.
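
Pulling the earlier snippets together, that combination looks roughly like this (same hypothetical cache and db clients as above):

// Reads: cache-aside
async function getUser(id) {
  const key = `user:${id}`;
  const cached = await cache.get(key);
  if (cached) return cached;

  const user = await db.users.find({ id });
  if (user) {
    await cache.set(key, user, { ttl: 3600 });
  }
  return user;
}

// Writes: write-through, so the next read sees the new value immediately
async function updateUser(id, data) {
  const updated = await db.users.update({ id }, data);
  await cache.set(`user:${id}`, updated, { ttl: 3600 });
  return updated;
}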