
[Feature Request] Add TryRemove(KeyValuePair<K, V>) overload #371

@enioluwas

Description


Essentially what I am trying to achieve is "try to remove this key only if it still has this value". For background, my use case for this library is an LRU cache for resource mappings provided by an external server. Every resource mapping comes with an expiration date, so at some point the value has to be refreshed from the server. I am using the AsAsyncCache and WithAtomicGetOrAdd builder options to cache the result of the network call. Each value inserted into the cache has an ExpiryTime property. After calling GetOrAddAsync, if the ExpiryTime has passed, the existing value has to be removed and replaced with a new one. Here is the basic flow with the existing APIs:

var entry = await this.cache.GetOrAddAsync(key, async (k) => ...);
if (entry.ExpiryTime < DateTimeOffset.UtcNow)
{
    this.logger.Log(...);
    this.cache.TryRemove(key);
    entry = await this.cache.GetOrAddAsync(key, async (k) => ...);
}

The problem with TryRemove(K) in this scenario is that multiple threads will run this logic concurrently, so there is a race between TryRemove and GetOrAddAsync. Correct me if I'm wrong about how those are synchronized, but I imagine it is possible for one thread to see an expired entry, remove it, and call GetOrAddAsync to cache and return a new value, and then for another thread that had also read the expired value to call TryRemove, remove the newly added value, and make another expensive network call in its second GetOrAddAsync. In practice this would be rare, but when it does happen it artificially lowers the cache hit rate.
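A minimal stand-in for that interleaving, collapsed onto one thread for clarity (this uses a plain ConcurrentDictionary in place of the cache, with hypothetical string keys and values, not the library's API):

using System;
using System.Collections.Concurrent;

var cache = new ConcurrentDictionary<string, string>();
cache["key"] = "expired-value";

string stale = cache["key"];                 // T2 observes the expired entry
cache["key"] = "fresh-value";                // T1 refreshes first (TryRemove + GetOrAddAsync)
cache.TryRemove("key", out _);               // T2's unconditional remove evicts the fresh value

Console.WriteLine(stale);                    // "expired-value" - what T2 acted on
Console.WriteLine(cache.ContainsKey("key")); // False - the next reader pays another network call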

So the solution here would be an atomic operation to "remove this key only if it still has this old value". With the new API it would be guaranteed that, in this situation, the network call is only made by one thread:

var entry = await this.cache.GetOrAddAsync(key, async (k) => ...);
if (entry.ExpiryTime < DateTimeOffset.UtcNow)
{
    this.logger.Log(...);
    this.cache.TryRemove(new KeyValuePair<K, V>(key, entry));
    entry = await this.cache.GetOrAddAsync(key, async (k) => ...);
}

Looking at the ConcurrentLruCore implementation of TryRemove(K), this looks like a relatively simple addition: the new overload could skip the enclosing while loop in the existing TryRemove logic and go straight to the atomic remove step. But I'm not sure if there are other factors to consider.
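For what it's worth, ConcurrentDictionary already has an atomic "remove this key only if it still maps to this value" primitive that an overload like this could presumably lean on. A small stand-alone sketch (plain ConcurrentDictionary, not ConcurrentLruCore internals):

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

var dict = new ConcurrentDictionary<string, int>();
dict["a"] = 1;

// Atomic compare-and-remove: only succeeds if "a" is still mapped to 1.
// On .NET 5+ the same behavior is exposed directly as dict.TryRemove(KeyValuePair<TKey, TValue>).
bool removed = ((ICollection<KeyValuePair<string, int>>)dict)
    .Remove(new KeyValuePair<string, int>("a", 1));        // true, entry removed

// A caller holding a stale value removes nothing and gets false back.
bool removedStale = ((ICollection<KeyValuePair<string, int>>)dict)
    .Remove(new KeyValuePair<string, int>("a", 1));        // false, entry already gone

Console.WriteLine($"{removed} {removedStale}");            // True False

If the overload follows that primitive, the value comparison would presumably use EqualityComparer<V>.Default, so for reference-type entries it would be reference equality unless the entry type overrides Equals; probably worth spelling out in the docs if this is added.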

I'm sure there are workarounds today, but based on my research I would need a custom IItemPolicy for my cache entry type that customizes ShouldDiscard, and I would need to wire together my own async cache with atomic get-or-add rather than using the ConcurrentLruBuilder. So having this API feels like the simplest option. Let me know what you think. Thanks!
