A goal of the new caching abstraction (#211) was to leave Getter and Reactor as unaware of the caching features as possible. I believe there is a lot of value in keeping Nuclear's API footprint as small as possible while still exposing powerful APIs that apply to many use cases: there's less code to maintain and fewer concepts for newcomers to learn. I imagined that these caching options (always/never caching globally or at the Getter level) would be implemented within a Cache instance rather than being integrated directly. For example, you could create a class that holds a mapping of Getters to their custom cache settings, like:

```js
var reactor = new Reactor({
  cache: new GranularCache({
    cache: "always",
    getters: Immutable.Map([
      [someGetter, "never"], // using the [k, v] tuple constructor because the key is an object
    ]),
  }),
})
```

I think the main changes here that we should keep are the caching improvements around removing observers. How does that sound?
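To make that concrete, here's a rough sketch of what a GranularCache along those lines might look like. It's only illustrative: the method names (lookup, miss, hit) are assumptions about the cache interface from #211 rather than the actual API, and the policy handling is simplified.

```js
var Immutable = require('immutable')

// Sketch only: lookup/miss/hit are assumed method names, not the confirmed
// cache interface from #211.
class GranularCache {
  constructor(options) {
    options = options || {}
    this.defaultPolicy = options.cache || 'default'        // 'always' | 'default' | 'never'
    this.getterPolicies = options.getters || Immutable.Map()
    this.entries = Immutable.Map()                          // getter -> cached entry
  }

  // Resolve the caching policy for a getter, falling back to the global default
  policyFor(getter) {
    return this.getterPolicies.get(getter, this.defaultPolicy)
  }

  lookup(getter) {
    return this.entries.get(getter)
  }

  miss(getter, entry) {
    if (this.policyFor(getter) === 'never') {
      return this // never store entries for getters marked 'never'
    }
    this.entries = this.entries.set(getter, entry)
    return this
  }

  hit(getter) {
    return this
  }
}
```

The point is just that per-getter behavior lives entirely inside the cache implementation, so Reactor and Getter stay unaware of it.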
@loganlinn @jordangarcia up to you guys what you want to do with this. It'd be a shame not to get this into prod soon, since it resolves a good chunk of the memory leaks we see on the Optimizely account.
@jordangarcia @loganlinn
This includes the changes for the cache abstraction Logan authored and the unwatch/config options I added. API changes include:
- `cache`: expects a cache constructor that conforms to the interface specified in Logan's earlier work.
- `maxItemsToCache`: if not specified, defaults to 1000. If a number > 0, the LRU cache will cache up to `maxItemsToCache` values. If non-numeric or `null`, we fall back to a `BasicCache` with no item limit.
- `useCache`: defaults to `true`. If `false`, no cache values will be set.
- `Getter` pseudo-constructor that can be used to add `cache` and `cacheKey` options for getters. If specified, `cacheKey` will be used as the getter cache key rather than the getter reference. Valid values for `cache` are `['always', 'default', 'never']`. If `'always'` or `'never'` is specified, that option is favored over the global `useCache` setting.
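Roughly, usage might look like the following. This is a hedged sketch: the Reactor option names come from the list above, but the exact call signature of the `Getter` pseudo-constructor (and whether it is exposed as `Nuclear.Getter`) is an assumption for illustration only.

```js
var Nuclear = require('nuclear-js')

// Reactor-level options described above.
var reactor = new Nuclear.Reactor({
  maxItemsToCache: 500, // LRU cache holding up to 500 values (defaults to 1000)
  useCache: true,       // global default; individual getters can override it
  // cache: CustomCache, // optionally, a cache constructor conforming to the #211 interface
})

// Getter-level options. Only the option names (cache, cacheKey) come from the
// description above; this particular calling convention is assumed.
var activeItems = Nuclear.Getter(
  ['items', (items) => items.filter((item) => item.get('active'))],
  {
    cache: 'always',         // favored over the global useCache setting
    cacheKey: 'activeItems', // used as the cache key instead of the getter reference
  }
)

reactor.evaluate(activeItems)
```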