
LRU Cache

Single-process cache with a fixed capacity that evicts the least recently used entry once it is full.

lru-cache #caching #python

Aliases

least recently used cache

Related Concepts

Core Idea

LRU (Least Recently Used) caches keep a fixed number of entries. Every time you read or write a key it moves to the “most recent” position; when the cache is full the oldest entry is evicted. Python’s functools.lru_cache decorator provides this behavior for pure functions.
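A minimal sketch of that behavior with `functools.lru_cache` (the function name `square` and `maxsize=2` are illustrative choices):

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def square(n):
    return n * n

square(1)  # miss: cached
square(2)  # miss: cache now full
square(1)  # hit: 1 moves to the "most recent" position
square(3)  # miss: evicts 2, the least recently used entry

info = square.cache_info()
print(info.hits, info.misses, info.currsize)  # 1 3 2
```

`cache_info()` exposes hit and miss counters, which makes the eviction order easy to verify in a REPL.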

Why It Matters Here

FastAPI exercises use LRU caches as a zero-dependency way to observe cache hits firsthand. Once you need multiple workers or pods, you promote the hot data into a Redis cache so every instance shares the same results.
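The single-process stage can be sketched with `collections.OrderedDict`, which is roughly what you would later swap out for a shared Redis store (the `LRUCache` class name and its methods are illustrative, not from any specific library):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal single-process LRU: most recent at the end, evict from the front."""

    def __init__(self, maxsize):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key, default=None):
        if key not in self._data:
            return default
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(maxsize=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")      # "a" becomes most recent
cache.put("c", 3)   # evicts "b"
print(list(cache._data))  # ['a', 'c']
```

Because this cache lives inside one process, each worker keeps its own copy; that is exactly the limitation that motivates moving hot data into Redis.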

Posts Mentioning This Concept