From ffa8eb9a245ab20be2d402e910007cc3f4ab7635 Mon Sep 17 00:00:00 2001
From: Alex Peck
Date: Thu, 13 Oct 2022 20:15:17 -0700
Subject: [PATCH 1/2] rme

---
 BitFaster.Caching/ReadMe.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/BitFaster.Caching/ReadMe.md b/BitFaster.Caching/ReadMe.md
index d34db786..f4f3acba 100644
--- a/BitFaster.Caching/ReadMe.md
+++ b/BitFaster.Caching/ReadMe.md
@@ -4,12 +4,12 @@ High performance, thread-safe in-memory caching primitives for .NET.
 
 ## ConcurrentLru
 
-`ConcurrentLru` is a light weight drop in replacement for `ConcurrentDictionary`, but with bounded size enforced by the TU-Q eviction policy (similar to [2Q](https://www.vldb.org/conf/1994/P439.PDF)). There are no background threads, no global locks, concurrent throughput is high, lookups are fast and hit rate outperforms a pure LRU in all tested scenarios.
+`ConcurrentLru` is a light weight drop in replacement for `ConcurrentDictionary`, but with bounded size enforced by the TU-Q eviction policy (based on [2Q](https://www.vldb.org/conf/1994/P439.PDF)). There are no background threads, no global locks, concurrent throughput is high, lookups are fast and hit rate outperforms a pure LRU in all tested scenarios.
 
 Choose a capacity and use just like `ConcurrentDictionary`, but with bounded size:
 
 ```csharp
-int capacity = 666;
+int capacity = 128;
 var lru = new ConcurrentLru(capacity);
 
 var value = lru.GetOrAdd("key", (key) => new SomeItem(key));
@@ -17,12 +17,12 @@ var value = lru.GetOrAdd("key", (key) => new SomeItem(key));
 
 ## ConcurrentLfu
 
-`ConcurrentLfu` is a drop in replacement for `ConcurrentDictionary`, but with bounded size enforced by the [W-TinyLFU eviction policy](https://arxiv.org/pdf/1512.00727.pdf). `ConcurrentLfu` has near optimal hit rate. Reads and writes are buffered then replayed asynchronously to mitigate lock contention.
+`ConcurrentLfu` is a drop in replacement for `ConcurrentDictionary`, but with bounded size enforced by the [W-TinyLFU admission policy](https://arxiv.org/pdf/1512.00727.pdf). `ConcurrentLfu` has near optimal hit rate and high scalability. Reads and writes are buffered then replayed asynchronously to mitigate lock contention.
 
 Choose a capacity and use just like `ConcurrentDictionary`, but with bounded size:
 
 ```csharp
-int capacity = 666;
+int capacity = 128;
 var lfu = new ConcurrentLfu(capacity);
 
 var value = lfu.GetOrAdd("key", (key) => new SomeItem(key));

From 0fa952e96ec8a9514d0fe900af8238972a4ca4f8 Mon Sep 17 00:00:00 2001
From: Alex Peck
Date: Thu, 13 Oct 2022 20:17:44 -0700
Subject: [PATCH 2/2] build

---
 .github/workflows/gate.yml | 2 --
 1 file changed, 2 deletions(-)

diff --git a/.github/workflows/gate.yml b/.github/workflows/gate.yml
index 2acb6275..2b3f1a56 100644
--- a/.github/workflows/gate.yml
+++ b/.github/workflows/gate.yml
@@ -2,10 +2,8 @@ name: Build
 
 on:
   push:
-    paths-ignore: [ '**.md' ]
     branches: [ main ]
   pull_request:
-    paths-ignore: [ '**.md' ]
     branches: [ main ]
 
 jobs:
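
The README snippets in the patch above omit the caches' generic type parameters and the `SomeItem` type. The sketch below shows the same usage in a self-contained form; the generic `ConcurrentLru<K, V>` / `ConcurrentLfu<K, V>` signatures, the `BitFaster.Caching.Lru` and `BitFaster.Caching.Lfu` namespaces, and the `SomeItem` class are assumptions for illustration, not part of the patch.

```csharp
using BitFaster.Caching.Lru;   // assumed namespace for ConcurrentLru
using BitFaster.Caching.Lfu;   // assumed namespace for ConcurrentLfu

// Hypothetical cached value type, standing in for the README's SomeItem.
public class SomeItem
{
    public string Key { get; }
    public SomeItem(string key) => Key = key;
}

public static class CacheUsageSketch
{
    public static void Main()
    {
        int capacity = 128;

        // Bounded cache with TU-Q eviction: size never exceeds capacity.
        var lru = new ConcurrentLru<string, SomeItem>(capacity);
        SomeItem lruValue = lru.GetOrAdd("key", key => new SomeItem(key));

        // Bounded cache with W-TinyLFU admission: reads and writes are
        // buffered and replayed asynchronously, per the README text.
        var lfu = new ConcurrentLfu<string, SomeItem>(capacity);
        SomeItem lfuValue = lfu.GetOrAdd("key", key => new SomeItem(key));
    }
}
```

As in the README snippets, `GetOrAdd` returns the cached value for the key, creating it with the supplied factory on a miss, so both caches can stand in for a `ConcurrentDictionary` while keeping the entry count bounded by the chosen capacity.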