Packages

package cache

Package Members

  1. package caffeine
  2. package guava

Type Members

  1. class ConcurrentMapCache[K, V] extends FutureCache[K, V]

    A com.twitter.cache.FutureCache backed by a java.util.concurrent.ConcurrentMap.

    A correct implementation should evict failed results and should not interrupt the underlying request that has already been fired off. EvictingCache$ and interrupting com.twitter.util.Futures are useful tools for building correct FutureCaches; see the sketch after this list. A reference implementation for caching the results of an asynchronous function with a ConcurrentMap can be found at FutureCache$.fromMap.

  2. abstract class FutureCache[K, V] extends AnyRef

    FutureCache represents an in-memory, in-process, asynchronous cache.

    Every cache operation is atomic.

    A correct implementation should evict failed results and should not interrupt the underlying request that has already been fired off. EvictingCache$ and interrupting com.twitter.util.Futures are useful tools for building correct FutureCaches; see the sketch after this list. A reference implementation for caching the results of an asynchronous function can be found at FutureCache$.default.

  3. abstract class FutureCacheProxy[K, V] extends FutureCache[K, V]

    A proxy for FutureCaches, useful for wrap-but-modify.
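
    A minimal sketch of the eviction guidance above, assuming that EvictingCache.apply wraps an existing FutureCache so that failed futures are evicted, and that ConcurrentMapCache can be constructed from a java.util.concurrent.ConcurrentMap; the key and value types and the fetchUser function are made up for illustration.

      import java.util.concurrent.ConcurrentHashMap
      import com.twitter.cache.{ConcurrentMapCache, EvictingCache, FutureCache}
      import com.twitter.util.Future

      // Hypothetical asynchronous function whose results we want to cache.
      val fetchUser: Long => Future[String] = id => Future.value("user-" + id)

      // A FutureCache backed by a ConcurrentMap (constructor assumed).
      val backing = new ConcurrentMapCache[Long, String](
        new ConcurrentHashMap[Long, Future[String]]())

      // EvictingCache (assumed) evicts failed futures so a later call can retry;
      // FutureCache.default also keeps any single caller from interrupting the
      // shared in-flight request.
      val evicting: FutureCache[Long, String] = EvictingCache(backing)
      val cachedFetch: Long => Future[String] = FutureCache.default(fetchUser, evicting)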

Value Members

  1. object AsyncMemoize
  2. object EvictingCache
  3. object FutureCache

    The FutureCache object provides the public interface for constructing FutureCaches. Once you've constructed a basic FutureCache, you should almost always wrap it with default. Normal usage looks like:

      val fn: K => Future[V]
      val map = (new java.util.concurrent.ConcurrentHashMap[K, Future[V]]()).asScala
      val cachedFn: K => Future[V] = FutureCache.default(fn, FutureCache.fromMap(map))

    We typically recommend that you use Caffeine Caches via com.twitter.cache.caffeine.CaffeineCache.

  4. object Refresh

    A single-value asynchronous cache with TTL. The provider supplies the future value when invoked and this class ensures that the provider is not invoked more frequently than the TTL. If the future fails, that failure is not cached. If more than one value is required, a FutureCache backed by a Caffeine cache with TTL may be more appropriate.

    This is useful in situations where a call to an external service returns a value that changes infrequently and we need to access that value often, for example asking a service for a list of features that it supports.

    A non-memoized function like this:

      def getData(): Future[T] = { ... }

    can be memoized with a TTL of 1 hour as follows (see also the usage sketch below):

      import com.twitter.conversions.DurationOps._
      import com.twitter.cache.Refresh

      val getData: () => Future[T] = Refresh.every(1.hour) { ... }
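
    For illustration, a hedged sketch of how such a memoized function behaves; the fetchSupportedFeatures function and the feature names are made up for this example.

      import com.twitter.conversions.DurationOps._
      import com.twitter.cache.Refresh
      import com.twitter.util.Future

      // Hypothetical external call whose result changes infrequently.
      def fetchSupportedFeatures(): Future[Seq[String]] =
        Future.value(Seq("feature-a", "feature-b"))

      val getFeatures: () => Future[Seq[String]] =
        Refresh.every(1.hour) { fetchSupportedFeatures() }

      // Within the hour, repeated calls reuse the cached future rather than
      // invoking fetchSupportedFeatures again; a failed future is not cached,
      // so the next call after a failure retries.
      val first: Future[Seq[String]] = getFeatures()
      val again: Future[Seq[String]] = getFeatures()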
