Standardise (and publish?) cache handling in standard library #53642
The standard library has several cache implementations (e.g. in re, fnmatch and ElementTree) with different cache size limiting strategies. These should be standardised and possibly even exposed for general use. Refer to the python-dev discussion.
With Raymond adding functools.lru_cache and functools.lfu_cache, it should be possible to use those for the various caches in the standard library. My only point of concern is that the standard lib caches tend to allow dynamic modification of the max cache size, while the new cache decorators appear to be fixed at the size specified when defining the decorator.
I will take a look at the various caches and see if some of their features can be consolidated. It is okay if some need to have their own strategies or situation-specific approaches, but you're right that common features should be looked at to see if they make sense in the functools caches. I am not sure that dynamically changing the maxsize is a generally useful feature (most are set to the largest size that makes sense and not revisited later), but I will take a look at whether that complicates the code (if it's simple and fast, it may be worth doing).
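Dynamic resizing never became part of the public lru_cache API, but it can be emulated by re-wrapping the original function, which the wrapper exposes via __wrapped__. A hedged sketch (the resize_lru helper name is mine, not stdlib; note the old cache contents are discarded):

```python
import functools

def resize_lru(cached_func, new_maxsize):
    # Rebuild the cache wrapper around the original function with a new
    # maxsize. The previous cache contents are lost.
    return functools.lru_cache(maxsize=new_maxsize)(cached_func.__wrapped__)

@functools.lru_cache(maxsize=128)
def square(n):
    return n * n

square = resize_lru(square, 1024)
print(square(12))                      # 144
print(square.cache_info().maxsize)     # 1024
```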
Yeah, I'm not sure the dynamic resizing makes sense in general. I was just pointing it out as something supported by the existing caches that could complicate a consolidation effort.
Applied the lru_cache() to fnmatch and re. I did find a simple way to dynamically resize the maxcache, but did not apply it yet. Will look at more applications to see if it is really needed. Nick, thanks for the great ideas. These changes simplified the code where they were applied and resulted in a smarter caching strategy.
On Mon, Aug 9, 2010 at 2:29 PM, Raymond Hettinger
The reason I mentioned the dynamic sizing specifically was that the …
I was thinking about the problem of developers wanting a different cache size than that provided in standard lib modules. ISTM that now that we've offered caching abilities in functools, a developer can easily add another layer of cache around any API they are interested in. For example, if someone is using thousands of recurring fnmatch patterns, they can write something like:

@functools.lfu_cache(maxsize=10000)   # custom fat cache

IMO, this beats adding caching controls to lots of APIs that should be focused only on their problem domain. IOW, it is probably not a good idea to add purge() and cache_resize() functions to multiple modules throughout the standard lib. ISTM we should just provide basic caching with reasonable space consumption (i.e. not huge) that gives improvements to common use cases (like I've done with the fnmatch and re modules) and let programmers with unusual cases add their own caching options rather than be tied into our choice of lru vs lfu or whatnot.
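The layering idea works regardless of eviction strategy; since lfu_cache was ultimately removed before release, this sketch uses lru_cache. The cached_fnmatch wrapper name is hypothetical:

```python
import fnmatch
import functools

# A caller-owned "fat cache" layered around a stdlib API, sized for a
# workload with thousands of recurring patterns.
@functools.lru_cache(maxsize=10000)
def cached_fnmatch(name, pattern):
    # fnmatchcase skips os.path.normcase, keeping results platform-stable
    # and therefore safe to cache.
    return fnmatch.fnmatchcase(name, pattern)

print(cached_fnmatch("report_2010.txt", "report_*.txt"))  # True
```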
On Mon, Aug 9, 2010 at 3:11 PM, Raymond Hettinger
A very good point! Perhaps we should note that somewhere? I'm not sure … Going the other way (using a smaller, or no, cache), perhaps in …
Great minds think alike. I was just about to propose that functools.wraps add a standard attribute to point at the underlying function (on the theory that objects should be introspectable). This would allow a standard way to get to the underlying unwrapped functions.
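This proposal did land: since Python 3.2, functools.update_wrapper (and therefore functools.wraps) sets a __wrapped__ attribute on the wrapper. A small sketch, where announce is an arbitrary example decorator:

```python
import functools

def announce(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@announce
def add(a, b):
    return a + b

# functools.wraps records the undecorated function on the wrapper,
# giving a standard introspection path back through decorator layers.
print(add.__wrapped__(2, 3))   # 5
print(add.__name__)            # add
```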
After discussion with RDM on IRC, I’m opening a new report to track this feature request separately. (It’s also a dependency of this bug.) |
Have we had any luck getting this to play nicely with the buildbots yet? (I lost track of where the last checkin got to). The necessary Modules/Setup change to adjust when _collections is built should have propagated through by now. |
Raymond, out of curiosity, can you tell why you removed lfu_cache? |