Manual Instrumentation for Cache Module

To monitor your caches, Sentry provides automatic instrumentation for popular Python caching setups (like Django, Redis, and memcached (coming soon)). But if you use a custom caching solution that has no automatic instrumentation, you can instrument it manually to get insight into how your caching solution is performing.

You need to create a span when you put something into the cache and a second span when you fetch something out of it. By adding some additional information to those spans, Sentry can show you how your caching system is performing.

Always make sure that a transaction is running when you create the spans. If you are using a web framework, those transactions are created for you. See Performance Monitoring for more information.

For detailed information about data that can be set, see the Cache Module Developer Specification.

If you use Django, Redis, or memcached (coming soon), then you do not need manual instrumentation. See the linked docs on how to enable automatic cache instrumentation.

If you have a caching solution not mentioned above, you need to do the following:

First, initialize the SDK with tracing enabled. Once this is done, Sentry's Python SDK captures all unhandled exceptions and transactions.

import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",

    # Enable performance monitoring
    enable_tracing=True,
)
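The snippets below use a hypothetical my_caching module as a stand-in for your own cache client. If you want to run them as-is, a minimal in-memory version could look like this (any client with similar get/set semantics works the same way):

```python
# my_caching.py -- minimal in-memory stand-in for a custom cache client.
# This is only a sketch for running the examples below; your real caching
# solution can be anything with comparable get/set semantics.
_store = {}

def set(key, value):
    _store[key] = value

def get(key):
    # Return None on a miss, as many cache clients do
    return _store.get(key)
```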

If the cache you’re using isn’t supported by Opt-In instrumentation mentioned above, you can use the Custom Instrumentation instructions below to emit cache spans:

  1. Set the cache value with whatever cache library you happen to be using.
  2. Wrap the code that sets the cache value in a with sentry_sdk.start_span(...) block.
  3. Set op to cache.set.
  4. Set cache.item_size to an integer representing the size of the cached item.

(The steps described above are documented in the snippet.)

import my_caching
import sentry_sdk

key = "myCacheKey123"
value = "The value I want to cache."

with sentry_sdk.start_span(op="cache.set") as span:
    # Set a key in your cache using your custom caching solution
    my_caching.set(key, value)

    # Describe the cache server you are accessing
    span.set_data("network.peer.address", "cache.example.com/supercache")
    span.set_data("network.peer.port", 9000)

    # Add the key you want to set
    span.set_data("cache.key", key)

    # Add the size of the value you stored in the cache
    span.set_data("cache.item_size", len(value))

The same applies to reading from the cache. To emit a cache.get span:

  1. Fetch the cached value with whatever cache library you happen to be using.
  2. Wrap the code that fetches the cached value in a with sentry_sdk.start_span(...) block.
  3. Set op to cache.get.
  4. Set cache.hit to a boolean representing whether the value was successfully fetched from the cache.
  5. Set cache.item_size to an integer representing the size of the cached item.

(The steps described above are documented in the snippet.)

import my_caching
import sentry_sdk

key = "myCacheKey123"
value = None

with sentry_sdk.start_span(op="cache.get") as span:
    # Get a key from your caching solution
    value = my_caching.get(key)

    # Describe the cache server you are accessing
    span.set_data("network.peer.address", "cache.example.com/supercache")
    span.set_data("network.peer.port", 9000)

    # Add the key you just retrieved from the cache
    span.set_data("cache.key", key)

    if value is not None:
        # If you retrieved a value, the cache was hit
        span.set_data("cache.hit", True)

        # Optionally also add the size of the value you retrieved
        span.set_data("cache.item_size", len(value))
    else:
        # If you could not retrieve a value, it was a miss
        span.set_data("cache.hit", False)

That's it. Once you have those spans in place, head over to Performance > Cache on sentry.io to see how your cache is doing.
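One practical detail: len(value) only makes sense for strings and bytes. If you cache arbitrary Python objects, you need to pick a convention for cache.item_size. A hedged sketch that reports a best-effort size in bytes (the helper name is ours, not part of the SDK):

```python
import pickle

def cache_item_size(value):
    """Best-effort size in bytes to report as cache.item_size."""
    if isinstance(value, bytes):
        return len(value)
    if isinstance(value, str):
        # Report the encoded byte length rather than the character count
        return len(value.encode("utf-8"))
    # Fall back to the pickled size for arbitrary objects
    return len(pickle.dumps(value))
```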
