Efficient Caching in Python: From Local to External Solutions

Track:
DevOps, Cloud, Scalable Infrastructure
Type:
Talk
Level:
Beginner
Duration:
30 minutes

Abstract

We all know that caching speeds up database queries, but do we know which caching technique is right for our specific use case? Did you know that even simple, seemingly obvious code can be drastically improved with the right caching strategy? Improper or excessive use of caching, however, can introduce unnecessary complexity that affects deployment, performance, scalability, and maintenance.

In this talk, we will address these challenges by exploring Python-specific caching strategies and the decision-making process behind transitioning from local in-memory caching to external solutions. We will focus on optimizing applications without overcomplicating them.

Starting with a quick introduction to caching fundamentals, we will dive into simple Pythonic ways to leverage local in-memory caching with functools.lru_cache, cachetools, and joblib, and explore how cache warming, expiry, and invalidation can further improve performance. After comparing multiple caching strategies, we will discuss external caching and when it’s time to use it. Through a live demo, we’ll see how to set up Redis and integrate it into a Python app to scale up our caching strategy effectively.
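As a taste of the local techniques the talk covers, here is a minimal sketch, assuming illustrative function names, sizes, and TTLs (they are not taken from the talk itself), of memoisation with functools.lru_cache and time-based expiry with cachetools.TTLCache:

```python
import time
from functools import lru_cache

from cachetools import TTLCache, cached


@lru_cache(maxsize=128)
def fetch_user(user_id: int) -> dict:
    # Illustrative stand-in for an expensive database query.
    time.sleep(0.5)
    return {"id": user_id, "name": f"user-{user_id}"}


# TTLCache evicts entries after `ttl` seconds, giving simple expiry.
report_cache = TTLCache(maxsize=256, ttl=60)


@cached(report_cache)
def build_report(day: str) -> str:
    time.sleep(1.0)
    return f"report for {day}"


if __name__ == "__main__":
    fetch_user(42)               # slow: first call is a cache miss
    fetch_user(42)               # fast: served from lru_cache
    build_report("2024-01-01")   # cached for 60 seconds
    fetch_user.cache_clear()     # explicit invalidation of the whole cache
```

Both caches live inside a single process, which is exactly the limitation that motivates the move to an external cache later in the talk.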

Through this talk, attendees will:

  • Learn how to implement efficient local in-memory caching using Python’s built-in libraries.
  • Discover practical techniques for cache invalidation, expiry, and cache warming to ensure optimal cache performance.
  • Understand when to scale with Redis (or other external caching solutions) and how to integrate them into Python projects (a minimal sketch follows this list).
  • Be equipped to choose the right caching strategy for different use cases, ensuring faster and more scalable Python applications.
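For readers curious about the external step, a minimal sketch of the kind of Redis integration the live demo builds might look like the following. It assumes the redis-py client and a Redis server on localhost:6379; the key names, TTL, and helper function are purely illustrative:

```python
import json
import time

import redis  # redis-py client; assumes a Redis server on localhost:6379

client = redis.Redis(host="localhost", port=6379, decode_responses=True)


def _load_product_from_db(product_id: int) -> dict:
    # Illustrative stand-in for a real database query.
    time.sleep(0.5)
    return {"id": product_id, "name": f"product-{product_id}"}


def get_product(product_id: int) -> dict:
    """Read-through cache: try Redis first, fall back to the slow source."""
    key = f"product:{product_id}"
    cached = client.get(key)
    if cached is not None:
        return json.loads(cached)

    product = _load_product_from_db(product_id)   # slow path on a miss
    client.setex(key, 300, json.dumps(product))   # expire after 5 minutes
    return product


if __name__ == "__main__":
    get_product(1)  # miss: hits the "database", then populates Redis
    get_product(1)  # hit: served from Redis, shared across processes
```

Unlike the in-process caches above, the Redis-backed cache is shared by every worker and survives application restarts, which is the trade-off the talk weighs when deciding whether the extra infrastructure is worth it.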