[PATCH 1 of 5] util: add an LRU cache dict

Siddharth Agarwal sid0 at fb.com
Fri Feb 8 16:49:18 CST 2013


On 02/08/2013 10:38 PM, Greg Ward wrote:
> It's a dict...

Yeah -- I fixed that in V2.

> Also, wouldn't _maxsize be a better name?

Maybe.

> Is a deque the right data structure here? I recently implemented an
> LRU cache using heapq to record the "fetch time" of each entry.
> __getitem__() incremented the age of the whole cache, counted in
> fetches, and set the fetch time of the fetched key to 0. So with every
> fetch, every other key becomes a bit older. I *think* that adds O(1)
> overhead to __getitem__().
>
> Then __setitem__() can lookup the oldest key in O(lg N) time when the
> cache is full.
>
> How does that compare to a deque? What about memory use?
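(For reference, the heap-based scheme described above might be sketched roughly as below. This is illustrative only, not code from either implementation; the class and attribute names are invented. Stale heap entries are skipped lazily at eviction time, which keeps __getitem__ at O(1) heap-push cost and eviction at amortized O(lg N).)

```python
import heapq

class HeapLRU(dict):
    """Illustrative heap-based LRU dict (not the patch's code)."""

    def __init__(self, maxsize):
        super(HeapLRU, self).__init__()
        self._maxsize = maxsize
        self._tick = 0    # global fetch counter: the "age" of the cache
        self._heap = []   # (last-fetch tick, key); may hold stale entries
        self._last = {}   # key -> most recent tick, to detect stale entries

    def _touch(self, key):
        # record that key was just fetched/stored
        self._tick += 1
        self._last[key] = self._tick
        heapq.heappush(self._heap, (self._tick, key))

    def __getitem__(self, key):
        value = super(HeapLRU, self).__getitem__(key)
        self._touch(key)
        return value

    def __setitem__(self, key, value):
        if key not in self and len(self) >= self._maxsize:
            # evict the least recently fetched key, skipping stale entries
            while True:
                tick, old = heapq.heappop(self._heap)
                if self._last.get(old) == tick:
                    super(HeapLRU, self).__delitem__(old)
                    del self._last[old]
                    break
        self._touch(key)
        super(HeapLRU, self).__setitem__(key, value)
```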

I don't think it matters -- it's meant for small caches just like the 
lrucachefunc right below, and is in fact a straight no-brainer copy of 
that algorithm.
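(The deque-based algorithm being referred to works roughly like this -- a sketch in the spirit of lrucachefunc, not the exact patch code. The O(n) remove() on fetch is the trade-off: fine for the small caches this is meant for.)

```python
import collections

class lrucachedict(object):
    """Illustrative deque-based LRU dict (not the exact patch code)."""

    def __init__(self, maxsize):
        self._cache = {}
        self._order = collections.deque()  # oldest key at the left
        self._maxsize = maxsize

    def __getitem__(self, key):
        value = self._cache[key]
        self._order.remove(key)   # O(n), acceptable for small caches
        self._order.append(key)   # mark as most recently used
        return value

    def __setitem__(self, key, value):
        if key not in self._cache:
            if len(self._cache) >= self._maxsize:
                # evict the least recently used key
                del self._cache[self._order.popleft()]
        else:
            self._order.remove(key)
        self._order.append(key)
        self._cache[key] = value

    def __contains__(self, key):
        return key in self._cache
```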

I guess I could add tests to it, but lrucachefunc isn't unit tested 
either...
