Last Updated: December 17, 2025
Your application has a configuration cache. Hundreds of threads read from it every second to get feature flags, rate limits, and service URLs. Every few minutes, a background thread updates the cache with fresh values from the database.
You add a mutex to protect the cache. Now only one thread can access it at a time. Reads that could safely happen in parallel are serialized. Your 99th percentile latency spikes. Users complain.
The problem is that a regular mutex treats every access the same. But reads and writes are fundamentally different: multiple threads can safely read the same data simultaneously, while only writes require exclusive access.
The Read-Write Lock Pattern exploits this asymmetry. It allows multiple concurrent readers but only one writer at a time. When no one is writing, readers proceed in parallel. When a writer needs access, it waits for readers to finish and blocks new readers until it's done.
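Go's standard library ships this pattern as `sync.RWMutex`. Here is a minimal sketch of the configuration-cache scenario using it; the `ConfigCache` type and its methods are illustrative names, not code from a specific library. `RLock` takes the shared lock, so the reader goroutines can run in parallel, while `Lock` gives the background writer exclusive access.

```go
package main

import (
	"fmt"
	"sync"
)

// ConfigCache guards a map of settings with a read-write lock.
// Many goroutines may call Get concurrently; Set takes the write
// lock and briefly excludes all readers.
type ConfigCache struct {
	mu    sync.RWMutex
	flags map[string]string
}

func NewConfigCache() *ConfigCache {
	return &ConfigCache{flags: make(map[string]string)}
}

// Get acquires the shared (read) lock: any number of readers
// may hold it at once, as long as no writer holds the lock.
func (c *ConfigCache) Get(key string) (string, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	v, ok := c.flags[key]
	return v, ok
}

// Set acquires the exclusive (write) lock: it waits for current
// readers to finish and blocks new readers until it releases.
func (c *ConfigCache) Set(key, value string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.flags[key] = value
}

func main() {
	cache := NewConfigCache()
	cache.Set("rate_limit", "100")

	// Simulate many concurrent readers; none of them block
	// each other because they only take the read lock.
	var wg sync.WaitGroup
	for i := 0; i < 8; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			cache.Get("rate_limit")
		}()
	}
	wg.Wait()

	v, _ := cache.Get("rate_limit")
	fmt.Println(v)
}
```

One caveat worth noting: a read-write lock only pays off when reads vastly outnumber writes, as in this cache. Under write-heavy load, the extra bookkeeping can make it slower than a plain mutex.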
In this chapter, we'll explore: