Proxy Caching

TTL (time-to-live) governs the refresh rate of cached content, which is critical for ensuring that clients aren’t served stale content. A TTL of 30 seconds means content older than 30 seconds is deemed expired and will be refreshed on subsequent requests. TTL values should be set based on the type of content the upstream service is serving:

  • Static data that is rarely updated can have longer TTL

  • Dynamic data should use shorter TTL to avoid serving outdated data

Kong Gateway follows RFC 7234 section 5.2 for cache control operations. See the specification and the Proxy Cache plugin documentation for more details on TTL configuration.

The following tutorial walks through managing proxy caching across various aspects in Kong Gateway.

Prerequisites

This chapter is part of the Get Started with Kong series. For the best experience, it is recommended that you follow the series from the beginning.

Start with the introduction Get Kong, which includes a list of prerequisites and instructions for running a local Kong Gateway.

Step two of the guide, Services and Routes, includes instructions for installing a mock service used throughout this series.

If you haven’t completed these steps already, complete them before proceeding.

Global proxy caching

Installing the plugin globally means every proxy request to Kong Gateway will potentially be cached.

  1. The Proxy Cache plugin is installed by default on Kong Gateway, and can be enabled by sending a request to the plugins object on the Admin API:
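A minimal request might look like the following, assuming the Admin API is listening on the default localhost:8001 (the configuration parameters match those used in the service-level example later in this guide):

```shell
# Enable the Proxy Cache plugin globally: POST to /plugins with no
# service or route scope. Assumes the Admin API on localhost:8001.
curl -i -X POST http://localhost:8001/plugins \
  --data "name=proxy-cache" \
  --data "config.request_method=GET" \
  --data "config.response_code=200" \
  --data "config.content_type=application/json; charset=utf-8" \
  --data "config.cache_ttl=30" \
  --data "config.strategy=memory"
```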

    If configuration was successful, you will receive a 201 response code.

This Admin API request configured the Proxy Cache plugin for all GET requests that result in a 200 response code and a response Content-Type header equal to application/json; charset=utf-8. The cache_ttl option instructs the plugin to flush cached values after 30 seconds.

The final option, config.strategy=memory, specifies the backing data store for cached responses. More information on strategy can be found in the configuration reference for the Proxy Cache plugin.

  2. Validate

You can check that the Proxy Cache plugin is working by sending GET requests and examining the returned headers. In step two of this guide, Services and Routes, you set up a /mock route and service that can help you see proxy caching in action.

    First, make an initial request to the /mock route. The Proxy Cache plugin returns status information headers prefixed with X-Cache, so use grep to filter for that information:
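One way to do this, assuming Kong’s proxy is listening on the default localhost:8000 and your mock service responds on a path such as /mock/anything (the exact path depends on how your mock service was configured in step two):

```shell
# Request the mock route and filter the response headers for
# the Proxy Cache plugin's X-Cache-* status information.
curl -i -s -XGET http://localhost:8000/mock/anything | grep X-Cache
```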

      On the initial request, there should be no cached responses, and the headers will indicate this with X-Cache-Status: Miss.

      X-Cache-Status: Miss

      Within 30 seconds of the initial request, repeat the command to send an identical request and the headers will indicate a cache Hit:

      The X-Cache-Status header can return the following cache results:

        • Miss: the request could be satisfied in cache, but an entry for the resource was not found, and the request was proxied upstream

        • Hit: the request was satisfied and served from cache

        • Refresh: the resource was found in cache, but could not satisfy the request due to Cache-Control behaviors or reaching its hard-coded cache_ttl threshold

        • Bypass: the request could not be satisfied from cache based on plugin configuration

    Service level proxy caching

    The Proxy Cache plugin can be enabled for a specific service. The request is the same as the global configuration above, but it is sent to the service URL:

      curl -X POST http://localhost:8001/services/example_service/plugins \
        --data "name=proxy-cache" \
        --data "config.request_method=GET" \
        --data "config.response_code=200" \
        --data "config.content_type=application/json; charset=utf-8" \
        --data "config.cache_ttl=30" \
        --data "config.strategy=memory"

    Route level proxy caching

    The Proxy Cache plugin can be enabled for specific routes. The request is the same as above, but it is sent to the route URL:

      curl -X POST http://localhost:8001/routes/example_route/plugins \
        --data "name=proxy-cache" \
        --data "config.request_method=GET" \
        --data "config.cache_ttl=30" \
        --data "config.strategy=memory"

    Consumer level proxy caching

    In Kong Gateway, consumers are an abstraction that defines a user of a service. Consumer-level proxy caching can be used to cache responses per consumer.

    1. Create a consumer

    Consumers are created using the consumer object in the Admin API.
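A minimal request might look like the following, using the consumer username sasha that the next step assumes:

```shell
# Create a consumer named "sasha" via the Admin API.
curl -X POST http://localhost:8001/consumers \
  --data "username=sasha"
```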

    2. Enable caching for the consumer

      curl -X POST http://localhost:8001/consumers/sasha/plugins \
        --data "name=proxy-cache" \
        --data "config.request_method=GET" \
        --data "config.response_code=200" \
        --data "config.content_type=application/json; charset=utf-8" \
        --data "config.cache_ttl=30" \
        --data "config.strategy=memory"

    Manage cached entities

    The Proxy Cache plugin supports administrative endpoints to manage cached entities. Administrators can view and delete cached entities, or purge the entire cache by sending requests to the Admin API.

    To retrieve a cached entity, send a request to the Admin API /proxy-cache endpoint with the X-Cache-Key value of a known cached response. The request must be sent before the TTL expires; otherwise, the cached entity has already been purged.

    For example, using the response headers above, pass the X-Cache-Key value of c9e1d4c8e5fd8209a5969eb3b0e85bc6 to the Admin API:
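A request of this shape should work, assuming the Admin API on the default localhost:8001:

```shell
# Look up a cached entity by its X-Cache-Key value.
curl -X GET http://localhost:8001/proxy-cache/c9e1d4c8e5fd8209a5969eb3b0e85bc6
```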

      A response with 200 OK will contain full details of the cached entity.
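Cached entities can also be deleted individually, or the entire cache purged, with DELETE requests to the same endpoint. Sketched here under the same localhost:8001 assumption:

```shell
# Delete a single cached entity by its X-Cache-Key value.
curl -X DELETE http://localhost:8001/proxy-cache/c9e1d4c8e5fd8209a5969eb3b0e85bc6

# Purge all cached entities.
curl -X DELETE http://localhost:8001/proxy-cache
```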

      See the Proxy Cache plugin documentation for the full list of the Proxy Cache specific Admin API endpoints.

