1. LRU cache (least recently used)
When a client requests resource A:
- if A is in the cache, return it
- if A is not in the cache and the cache still has an empty slot, read A from disk into the cache and return it
- if A is not in the cache and the cache is full, evict the least recently used entry, then read A from disk into the cache and return it
Implementation of LRU: a hash map plus a doubly linked list (see the sketch below)
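A minimal Python sketch of that structure, assuming a hypothetical read_from_disk() as the slow backing store: the hash map gives O(1) lookup, and the doubly linked list keeps entries ordered by recency so the tail is always the eviction victim.

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                       # key -> Node, O(1) lookup
        self.head = Node(None, None)        # sentinel: most recent side
        self.tail = Node(None, None)        # sentinel: least recent side
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key in self.map:                 # hit: move to front, return
            node = self.map[key]
            self._unlink(node)
            self._push_front(node)
            return node.value
        if len(self.map) >= self.capacity:  # full: evict least recently used
            lru = self.tail.prev
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, read_from_disk(key))  # miss: load from disk, cache it
        self.map[key] = node
        self._push_front(node)
        return node.value

def read_from_disk(key):
    # Placeholder for the real (slow) backing-store read.
    return f"value-for-{key}"
```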
2. Eviction policy
- random replacement (used by caches in some ARM processors)
- LRU
- least frequently used (LFU: keep a hit count for each entry; see the sketch below)
- windowed LFU (only count hits within a recent window, so stale entries age out)
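A minimal count-based LFU sketch (plain LFU, not the windowed variant): each entry keeps a hit counter, and the entry with the smallest counter is evicted. Real implementations replace the O(n) victim scan here with frequency buckets or a min-heap.

```python
import collections

class LFUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}
        self.counts = collections.Counter()  # per-entry hit count

    def get(self, key):
        if key in self.values:
            self.counts[key] += 1            # count every access
            return self.values[key]
        return None

    def put(self, key, value):
        if key not in self.values and len(self.values) >= self.capacity:
            # Evict the least frequently used entry.
            victim = min(self.counts, key=self.counts.get)
            del self.values[victim]
            del self.counts[victim]
        self.values[key] = value
        self.counts[key] += 1
```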
3. Concurrency: two clients want to write to the same entry
- locks / lock striping (segment locks), as sketched below
- commit log (record the writes in a log and apply them asynchronously)
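A small lock-striping sketch (one way to implement segment locks): each key hashes to one of N locks, so two clients writing the same entry serialize on the same stripe while writes to other entries proceed in parallel. The class name and stripe count are illustrative.

```python
import threading

class StripedCache:
    def __init__(self, num_stripes=16):
        self.data = {}
        self.locks = [threading.Lock() for _ in range(num_stripes)]

    def _lock_for(self, key):
        # Same key always maps to the same stripe lock.
        return self.locks[hash(key) % len(self.locks)]

    def put(self, key, value):
        with self._lock_for(key):
            self.data[key] = value

    def get(self, key):
        with self._lock_for(key):
            return self.data.get(key)
```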
4. Distributed cache:
the key space is sharded across machines; a value can store a reference to the machine that actually holds the data, so a lookup is forwarded to the owning node (see the sketch below)
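One common way to decide which machine owns a key is consistent hashing, named here as an assumption since the note doesn't specify it. The sketch below (machine names are made up) shows how any node can compute a key's owner, which is what a stored reference would point at.

```python
import bisect
import hashlib

class HashRing:
    def __init__(self, machines, replicas=100):
        # Place each machine at several points on the hash ring
        # so keys spread evenly.
        self.ring = []  # sorted (hash, machine) points
        for m in machines:
            for i in range(replicas):
                h = self._hash(f"{m}#{i}")
                bisect.insort(self.ring, (h, m))

    @staticmethod
    def _hash(s):
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    def owner(self, key):
        # The first machine clockwise from the key's hash owns it.
        h = self._hash(key)
        i = bisect.bisect(self.ring, (h, ""))
        return self.ring[i % len(self.ring)][1]  # modulo wraps the ring

ring = HashRing(["cache-a", "cache-b", "cache-c"])
print(ring.owner("user:42"))  # the machine this key's value lives on
```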
References:
- Introduction to Memcached: http://www.slideshare.net/oemebamo/introduction-to-memcached
- Cache algorithms: https://en.wikipedia.org/wiki/Cache_algorithms