Module Resource_cache.Make
Cache.Make creates a cache module that exposes a simple with_ interface over its resources. The cache has the following properties:
Resource reuse: When a resource r is opened, it will remain open until one of the following:
- f r raised an exception, where f was a function passed to with_
- r has been idle for idle_cleanup_after
- r has been used max_resource_reuse times
- close_and_flush has been called on the cache

When a resource is closed, either because of one of the above conditions or because it was closed by other means, it no longer counts towards the limits.
Limits: The cache respects the following limits:
- No more than max_resources are open simultaneously
- No more than max_resources_per_id are open simultaneously for a given id (args)
Parameters
R : Resource.S
Signature
val init : config:Config.t -> log_error:(string -> unit) -> common_args -> t
val status : t -> Status.t
val config : t -> Config.t
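For orientation, here is a minimal sketch of building and initializing a cache. My_resource is a hypothetical module assumed to satisfy Resource.S, and the Config.create labels shown merely mirror the limits described above; treat both as assumptions rather than the library's confirmed API.

```ocaml
open! Core
open! Async

(* [My_resource] is a hypothetical module assumed to satisfy [Resource.S]. *)
module My_cache = Resource_cache.Make (My_resource)

let create_cache common_args =
  (* The [Config.create] labels below are assumptions mirroring the documented
     limits (max_resources, max_resources_per_id, max_resource_reuse,
     idle_cleanup_after). *)
  let config =
    Resource_cache.Config.create
      ~max_resources:20
      ~max_resources_per_id:5
      ~max_resource_reuse:100
      ~idle_cleanup_after:(Time_ns.Span.of_sec 30.)
  in
  My_cache.init
    ~config
    ~log_error:(fun msg -> eprintf "cache error: %s\n" msg)
    common_args
```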
val with_ : ?open_timeout:Core_kernel.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> t -> key -> f:(resource -> 'a Async_kernel.Deferred.t) -> 'a Async_kernel.Deferred.Or_error.t
with_ t key ~f calls f resource, where resource is either:
1) An existing cached resource that was opened with key' such that R.Key.equal key key'
2) A newly opened resource created by R.open_ key common_args, respecting the limits of t.config

Returns an error if:
- the cache is closed
- R.open_ returned an error
- no resource is obtained before give_up is determined

If f raises, the exception is not caught, but the resource will be closed and the Cache will remain in a working state (no resources are lost).
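A usage sketch, continuing the assumptions above (My_cache produced by Resource_cache.Make, and a hypothetical My_resource.query call on the resource):

```ocaml
open! Core
open! Async

let fetch cache key =
  My_cache.with_ cache key
    (* Stop waiting for a free slot after five seconds. *)
    ~give_up:(Clock_ns.after (Time_ns.Span.of_sec 5.))
    ~f:(fun resource -> My_resource.query resource "status")
  >>| function
  | Ok response -> printf "got: %s\n" response
  | Error err -> eprintf "fetch failed: %s\n" (Error.to_string_hum err)
```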
val with_' : ?open_timeout:Core_kernel.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> t -> key -> f:(resource -> 'a Async_kernel.Deferred.t) -> [ `Ok of 'a | `Gave_up_waiting_for_resource | `Error_opening_resource of Core_kernel.Error.t | `Cache_is_closed ] Async_kernel.Deferred.t
Like with_ but classifies the different errors.
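For example, a caller that wants to distinguish a saturated or closed cache from a failed open can match on the classified result (same hypothetical My_resource.query as above):

```ocaml
open! Core
open! Async

let fetch' cache key =
  My_cache.with_' cache key ~f:(fun resource ->
    My_resource.query resource "status")
  >>| function
  | `Ok response -> Some response
  | `Gave_up_waiting_for_resource | `Cache_is_closed -> None
  | `Error_opening_resource err ->
    eprintf "open failed: %s\n" (Error.to_string_hum err);
    None
```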
val with_any : ?open_timeout:Core_kernel.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> ?load_balance:bool -> t -> key list -> f:(resource -> 'a Async_kernel.Deferred.t) -> (key * 'a) Async_kernel.Deferred.Or_error.t
Like with_ and with_' except f is run on the first matching available resource (or the first resource that has availability to be opened). Preference is given towards resources earlier in the list, unless ~load_balance:true has been specified, in which case preference is given to ensure that load is approximately balanced: the key with the least number of open connections will be favored.
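A sketch of spreading requests across interchangeable backends; keys is assumed to be a list of keys that all serve the same data, and My_resource.query is hypothetical as before:

```ocaml
open! Core
open! Async

let fetch_from_any cache keys =
  (* With [~load_balance:true], the key with the fewest open connections is
     preferred; otherwise keys earlier in the list win. *)
  Deferred.Or_error.map
    (My_cache.with_any cache keys ~load_balance:true ~f:(fun resource ->
       My_resource.query resource "status"))
    ~f:(fun (_key_used, response) -> response)
```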
val with_any' : ?open_timeout:Core_kernel.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> ?load_balance:bool -> t -> key list -> f:(resource -> 'a Async_kernel.Deferred.t) -> [ `Ok of key * 'a | `Error_opening_resource of key * Core_kernel.Error.t | `Gave_up_waiting_for_resource | `Cache_is_closed ] Async_kernel.Deferred.t
val with_any_loop : ?open_timeout:Core_kernel.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> ?load_balance:bool -> t -> key list -> f:(resource -> 'a Async_kernel.Deferred.t) -> [ `Ok of key * 'a | `Error_opening_all_resources of (key * Core_kernel.Error.t) list | `Gave_up_waiting_for_resource | `Cache_is_closed ] Async_kernel.Deferred.t
Tries with_any' in a loop (removing args that have open errors) until receiving an `Ok, or until it has failed to open all resources in args_list.
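A sketch of a caller that retries across all keys and collapses a total failure into a single error (assumes ppx_let's match%map is available; My_resource.query is hypothetical):

```ocaml
open! Core
open! Async

let fetch_resilient cache keys =
  match%map
    My_cache.with_any_loop cache keys ~f:(fun resource ->
      My_resource.query resource "status")
  with
  | `Ok (_key, response) -> Ok response
  | `Gave_up_waiting_for_resource ->
    Or_error.error_string "gave up waiting for a resource"
  | `Cache_is_closed -> Or_error.error_string "cache is closed"
  | `Error_opening_all_resources errors ->
    (* Every key failed to open; surface all the errors together. *)
    Error (Error.of_list (List.map errors ~f:snd))
```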
val close_started : t -> bool
val close_finished : t -> unit Async_kernel.Deferred.t
val close_and_flush : t -> unit Async_kernel.Deferred.t
Close all currently open resources and prevent the creation of new ones. All subsequent calls to with_ and immediate fail with `Cache_is_closed. Any jobs that are waiting for a connection will return with `Cache_is_closed. The returned Deferred.t is determined when all jobs have finished running and all resources have been closed.
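A shutdown sketch: if a close is already in flight, just wait for it to finish; otherwise initiate one and wait for all jobs and resources to be released.

```ocaml
open! Core
open! Async

let shutdown_cache cache =
  if My_cache.close_started cache
  then
    (* Another caller already began closing; wait for it to complete. *)
    My_cache.close_finished cache
  else
    (* Reject new [with_] calls; determined once every job has finished and
       every resource has been closed. *)
    My_cache.close_and_flush cache
```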