functools.cache and lru_cache
Memoize expensive function calls with functools.cache and lru_cache for automatic result caching.
from functools import cache, lru_cache
# Unlimited cache (Python 3.9+)
@cache
def fibonacci(n: int) -> int:
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
# LRU cache with max size
@lru_cache(maxsize=256)
def fetch_config(key: str) -> dict:
    # Simulated expensive lookup
    configs = {
        "db": {"host": "localhost", "port": 5432},
        "cache": {"host": "localhost", "port": 6379},
    }
    return configs.get(key, {})
# Cache info
print(fibonacci(100))
print(fibonacci.cache_info())
# Clear cache
fetch_config.cache_clear()
Use Cases
- Recursive algorithms
- API response caching
- Expensive computations
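To see the LRU eviction and hit/miss accounting in action, here is a minimal sketch; `square` and its `maxsize` of 2 are hypothetical examples, not part of the snippet above:

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def square(n: int) -> int:
    # Stand-in for an expensive computation
    return n * n

square(1)  # miss
square(2)  # miss; cache is now full
square(1)  # hit
square(3)  # miss; evicts the least-recently-used entry (2)
info = square.cache_info()
print(info.hits, info.misses, info.currsize)  # 1 3 2
```

Note that the cache stores and returns the same object on every hit, so a mutable result (like the dict from `fetch_config`) is shared across all callers; mutating it mutates the cached value.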