Commit 02566ec8 authored by Raymond Hettinger

Adopt more descriptive attribute names as suggested on python-dev.

parent e9a4de51
@@ -47,12 +47,12 @@ The :mod:`functools` module defines the following functions:
    results, the positional and keyword arguments to the function must be
    hashable.
 
-   The wrapped function is instrumented with two attributes, :attr:`hits`
-   and :attr:`misses` which count the number of successful or unsuccessful
+   The wrapped function is instrumented with two attributes, :attr:`cache_hits`
+   and :attr:`cache_misses` which count the number of successful or unsuccessful
    cache lookups. These statistics are helpful for tuning the *maxsize*
    parameter and for measuring the cache's effectiveness.
 
-   The wrapped function also has a :attr:`clear` attribute which can be
+   The wrapped function also has a :attr:`cache_clear` attribute which can be
    called (with no arguments) to clear the cache.
 
    The original underlying function is accessible through the
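
For reference, the renamed attributes read like this in use; a minimal sketch, not part of this changeset, assuming the in-development lru_cache from this patch where cache_hits, cache_misses, and cache_clear live directly on the wrapper:

    import functools

    @functools.lru_cache(maxsize=32)
    def square(x):
        # Results are memoized, keyed on the (hashable) argument.
        return x * x

    for n in (1, 2, 1, 3, 1):
        square(n)

    print(square.cache_hits, square.cache_misses)   # 2 3: two repeats, three first-time calls
    square.cache_clear()                            # empties the cache and resets both counters
    print(square.cache_hits, square.cache_misses)   # 0 0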
@@ -85,17 +85,17 @@ New, Improved, and Deprecated Modules
         return c.fetchone()[0]
 
 To help with choosing an effective cache size, the wrapped function is
-instrumented with two attributes *hits* and *misses*::
+instrumented with two attributes *cache_hits* and *cache_misses*::
 
     >>> for name in user_requests:
     ...     get_phone_number(name)
-    >>> print(get_phone_number.hits, get_phone_number.misses)
+    >>> print(get_phone_number.cache_hits, get_phone_number.cache_misses)
     4805 980
 
 If the phonelist table gets updated, the outdated contents of the cache can be
 cleared with::
 
-    >>> get_phone_number.clear()
+    >>> get_phone_number.cache_clear()
 
 (Contributed by Raymond Hettinger)
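
The excerpt above shows only the tail of the whatsnew example. A plausible self-contained reconstruction, assuming a sqlite3 database with a phonelist(name, phonenumber) table and the attribute names introduced by this patch, looks roughly like:

    import sqlite3
    import functools

    conn = sqlite3.connect(':memory:')
    conn.execute('CREATE TABLE phonelist (name TEXT PRIMARY KEY, phonenumber TEXT)')
    conn.execute("INSERT INTO phonelist VALUES ('guido', '555-0100')")

    @functools.lru_cache(maxsize=300)
    def get_phone_number(name):
        # Each distinct name queries the database once; repeats come from the cache.
        c = conn.cursor()
        c.execute('SELECT phonenumber FROM phonelist WHERE name = ?', (name,))
        return c.fetchone()[0]

    get_phone_number('guido')
    get_phone_number('guido')
    print(get_phone_number.cache_misses, get_phone_number.cache_hits)   # 1 1 with this patch applied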
@@ -142,23 +142,23 @@ def lru_cache(maxsize=100):
                 with lock:
                     result = cache[key]
                     cache_renew(key)            # record recent use of this key
-                    wrapper.hits += 1
+                    wrapper.cache_hits += 1
             except KeyError:
                 result = user_function(*args, **kwds)
                 with lock:
                     cache[key] = result         # record recent use of this key
-                    wrapper.misses += 1
+                    wrapper.cache_misses += 1
                     if len(cache) > maxsize:
                         cache_popitem(0)        # purge least recently used cache entry
             return result
 
-        def clear():
+        def cache_clear():
             """Clear the cache and cache statistics"""
             with lock:
                 cache.clear()
-                wrapper.hits = wrapper.misses = 0
+                wrapper.cache_hits = wrapper.cache_misses = 0
 
-        wrapper.hits = wrapper.misses = 0
-        wrapper.clear = clear
+        wrapper.cache_hits = wrapper.cache_misses = 0
+        wrapper.cache_clear = cache_clear
         return wrapper
     return decorating_function
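
The hunk only touches attribute names, but the surrounding structure is easy to lose in a diff. A runnable distillation of the same pattern (the name simple_lru_cache is illustrative; keyword-argument handling and the other micro-optimizations are omitted for brevity):

    from collections import OrderedDict
    from functools import wraps
    from threading import Lock

    def simple_lru_cache(maxsize=100):
        def decorating_function(user_function):
            cache = OrderedDict()       # ordered least recently to most recently used
            lock = Lock()               # OrderedDict updates are not atomic

            @wraps(user_function)
            def wrapper(*args):
                try:
                    with lock:
                        result = cache[args]
                        cache.move_to_end(args)        # record recent use of this key
                        wrapper.cache_hits += 1
                except KeyError:
                    result = user_function(*args)
                    with lock:
                        cache[args] = result
                        wrapper.cache_misses += 1
                        if len(cache) > maxsize:
                            cache.popitem(last=False)  # purge least recently used entry
                return result

            def cache_clear():
                """Clear the cache and cache statistics"""
                with lock:
                    cache.clear()
                    wrapper.cache_hits = wrapper.cache_misses = 0

            wrapper.cache_hits = wrapper.cache_misses = 0
            wrapper.cache_clear = cache_clear
            return wrapper
        return decorating_function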
@@ -207,8 +207,8 @@ def compile(pattern, flags=0):
 
 def purge():
     "Clear the regular expression caches"
-    _compile_typed.clear()
-    _compile_repl.clear()
+    _compile_typed.cache_clear()
+    _compile_repl.cache_clear()
 
 def template(pattern, flags=0):
     "Compile a template pattern, returning a pattern object"
@@ -508,21 +508,21 @@ class TestLRU(unittest.TestCase):
             actual = f(x, y)
             expected = orig(x, y)
             self.assertEquals(actual, expected)
-        self.assert_(f.hits > f.misses)
-        self.assertEquals(f.hits + f.misses, 1000)
+        self.assert_(f.cache_hits > f.cache_misses)
+        self.assertEquals(f.cache_hits + f.cache_misses, 1000)
 
-        f.clear()   # test clearing
-        self.assertEqual(f.hits, 0)
-        self.assertEqual(f.misses, 0)
+        f.cache_clear()   # test clearing
+        self.assertEqual(f.cache_hits, 0)
+        self.assertEqual(f.cache_misses, 0)
         f(x, y)
-        self.assertEqual(f.hits, 0)
-        self.assertEqual(f.misses, 1)
+        self.assertEqual(f.cache_hits, 0)
+        self.assertEqual(f.cache_misses, 1)
 
         # Test bypassing the cache
         self.assertIs(f.__wrapped__, orig)
         f.__wrapped__(x, y)
-        self.assertEqual(f.hits, 0)
-        self.assertEqual(f.misses, 1)
+        self.assertEqual(f.cache_hits, 0)
+        self.assertEqual(f.cache_misses, 1)
 
         # test size zero (which means "never-cache")
         @functools.lru_cache(0)