memoization

How to add fields that only cache something to an ADT?

Submitted by ≯℡__Kan透↙ on 2019-12-03 10:17:48
Often I need to add fields to an ADT that only memoize some redundant information, but I haven't figured out how to do it nicely and efficiently. The best way to show the problem is with an example. Suppose we're working with untyped lambda terms:

    type VSym = String

    data Lambda = Var VSym
                | App Lambda Lambda
                | Abs VSym Lambda

And from time to time we need to compute the set of free variables of a term:

    fv :: Lambda -> Set VSym
    fv (Var v)   = Set.singleton v
    fv (App s t) = (fv s) `Set.union` (fv t)
    fv (Abs v t) = v `Set.delete` (fv t)

Soon we realize that repeated …
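
The question is posed in Haskell, but the underlying technique (have each constructor also store the derived data, so it is computed once per node rather than recomputed on every query) can be sketched in Python; everything below is purely illustrative and not part of the original question:

    from dataclasses import dataclass

    # Hypothetical sketch: each term node carries a precomputed set of free
    # variables, filled in by the constructor helpers below.

    @dataclass(frozen=True)
    class Term:
        free: frozenset  # cached free-variable set

    @dataclass(frozen=True)
    class Var(Term):
        name: str

    @dataclass(frozen=True)
    class App(Term):
        fun: Term
        arg: Term

    @dataclass(frozen=True)
    class Abs(Term):
        param: str
        body: Term

    def var(name):
        return Var(free=frozenset([name]), name=name)

    def app(fun, arg):
        # Free variables of an application: union of both subterms.
        return App(free=fun.free | arg.free, fun=fun, arg=arg)

    def abs_(param, body):
        # Free variables of an abstraction: the body's, minus the bound one.
        return Abs(free=body.free - {param}, param=param, body=body)

The point of the sketch is only that the cache lives next to the data and is maintained by smart constructors, which is the design question the excerpt is asking about for Haskell.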

What can be done to speed up this memoization decorator?

Submitted by 谁说胖子不能爱 on 2019-12-03 08:47:47
What I want is a memoization decorator that:

- can memoize instance methods with both arguments and keyword arguments
- has a cache that can be cleared (globally) with one call (vs. this one that uses a per-function cache: python resettable instance method memoization decorator)
- is reasonably efficient

I've tweaked an example I saw and came up with the following:

    import functools

    class Memoized(object):
        """Decorator that caches a function's return value each time it is called.
        If called later with the same arguments, the cached value is returned, and
        not re-evaluated.
        """
        __cache = {}

        def __init_ …
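
A minimal sketch of the kind of decorator described above, assuming one module-level dict shared by every decorated callable (all names are illustrative, not the question's actual code):

    import functools

    _GLOBAL_CACHE = {}  # shared by every decorated function and method

    def memoized(func):
        """Cache keyed on the function plus its positional and keyword
        arguments; requires all arguments to be hashable."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = (func, args, frozenset(kwargs.items()))
            if key not in _GLOBAL_CACHE:
                _GLOBAL_CACHE[key] = func(*args, **kwargs)
            return _GLOBAL_CACHE[key]
        return wrapper

    def clear_memo():
        """One call clears every cached result, for all decorated callables."""
        _GLOBAL_CACHE.clear()

For instance methods, self simply becomes part of the key, so the instances themselves must be hashable (the default object identity hash is usually enough).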

Is there an established memoize on-disk decorator for python?

Submitted by 泄露秘密 on 2019-12-03 07:36:08
I have been searching a bit for a Python module that offers a memoize decorator with the following capabilities:

- Stores the cache on disk to be reused among subsequent program runs.
- Works for any pickle-able arguments, most importantly numpy arrays.
- (Bonus) checks whether arguments are mutated in function calls.

I found a few small code snippets for this task and could probably implement one myself, but I would prefer having an established package. I also found incpy, but that does not seem to work with the standard Python interpreter. Ideally, I would like to have something like …
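
One established package that covers the first two requirements is joblib, whose Memory class pickles results to a directory on disk and reuses them across runs; a minimal sketch (the cache directory name is illustrative, and joblib does not provide the bonus mutation check):

    from joblib import Memory
    import numpy as np

    # Results are written under ./cachedir and reused by later program runs.
    memory = Memory("cachedir", verbose=0)

    @memory.cache
    def expensive(x):
        # x can be a numpy array; joblib hashes the array contents for the key.
        return np.linalg.norm(x)

    print(expensive(np.arange(10)))   # computed on the first run, loaded from disk afterwards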

Efficient memoization in Python

Submitted by 白昼怎懂夜的黑 on 2019-12-03 07:04:14
Question: I have a task to solve, and the most important part at the moment is to make the script as time-efficient as possible. One of the elements I am trying to optimize is memoization within one of the functions. So my question is: which of the following 3-4 methods is the most efficient / fastest way of implementing memoization in Python? I have provided the code only as an example - if one of the methods is more efficient in a case other than the one I mentioned, please share what you know. Solution 1 - …
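
The excerpt cuts off before the candidate solutions, so for reference here are two generic baselines that such comparisons usually include - the standard-library functools.lru_cache and a plain dict - as a minimal sketch, not the question's own "Solution 1":

    import functools

    @functools.lru_cache(maxsize=None)   # unbounded cache, lookup done in C
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    # Plain-dict variant for comparison:
    _cache = {}

    def fib_dict(n):
        if n not in _cache:
            _cache[n] = n if n < 2 else fib_dict(n - 1) + fib_dict(n - 2)
        return _cache[n]

lru_cache is usually the fastest hashable-argument option available without third-party packages, at the cost of some flexibility (no keyword-order normalization beyond what it does itself, no per-argument customization).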

How to memoize **kwargs?

Submitted by 非 Y 不嫁゛ on 2019-12-03 06:59:46
Question: I haven't seen an established way to memoize a function that takes keyword arguments, i.e. something of the form def f(*args, **kwargs), since typically a memoizer has a dict to cache results for a given set of input parameters, and kwargs is a dict and hence unhashable. I have tried, following discussions here, using (args, frozenset(kwargs.items())) as the key to the cache dict, but this only works if the values in kwargs are hashable. Furthermore, as pointed out in the answers below, frozenset …
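
A minimal sketch of the frozenset-keyed approach the excerpt mentions, with its limitation noted (the decorated function is illustrative):

    import functools

    def memoize_kwargs(func):
        """Sketch: works only when all positional and keyword values are hashable."""
        cache = {}

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = (args, frozenset(kwargs.items()))
            if key not in cache:
                cache[key] = func(*args, **kwargs)
            return cache[key]
        return wrapper

    @memoize_kwargs
    def f(*args, **kwargs):
        return sum(args) + sum(kwargs.values())

Note that the frozenset makes keyword order irrelevant (f(a=1, b=2) and f(b=2, a=1) share a key), while f(1) and f(x=1) remain distinct keys even if the function treats them identically.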

Pandas memoization

Submitted by 。_饼干妹妹 on 2019-12-03 06:24:24
I have lengthy computations which I repeat many times. Therefore, I would like to use memoization (packages such as jug and joblib), in concert with Pandas. The problem is whether these packages memoize Pandas DataFrames well as method arguments. Has anyone tried it? Is there any other recommended package/way to do this?

Author of jug here: jug works fine. I just tried the following and it works:

    from jug import TaskGenerator
    import pandas as pd
    import numpy as np

    @TaskGenerator
    def gendata():
        return pd.DataFrame(np.arange(343440).reshape((10, -1)))

    @TaskGenerator
    def compute(x):
        return x …
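
joblib's on-disk cache is another commonly used option here and also accepts DataFrame arguments; a minimal, hypothetical sketch (directory and function names are illustrative):

    from joblib import Memory
    import pandas as pd

    memory = Memory("pandas_cache", verbose=0)

    @memory.cache
    def summarize(df):
        # joblib hashes the (pickled) DataFrame contents to build the cache key,
        # so calling again with an equal DataFrame should reuse the stored result.
        return df.describe()

    df = pd.DataFrame({"a": range(1000)})
    result = summarize(df)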

Persistent memoization in Python

Submitted by 巧了我就是萌 on 2019-12-03 06:23:20
I have an expensive function that takes and returns a small amount of data (a few integers and floats). I have already memoized this function, but I would like to make the memo persistent. There are already a couple of threads relating to this, but I'm unsure about potential issues with some of the suggested approaches, and I have some fairly specific requirements:

- I will definitely use the function from multiple threads and processes simultaneously (both using multiprocessing and from separate python scripts)
- I will not need read or write access to the memo from outside this python function
- I …
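
The standard-library decorators do not cover the multi-process requirement directly. One rough, stdlib-only direction is an SQLite-backed memo (SQLite handles cross-process locking), shown below purely as an illustration: the path, table, and function names are assumptions, and it presumes the arguments pickle deterministically.

    import functools
    import pickle
    import sqlite3

    DB_PATH = "memo.sqlite"   # illustrative path

    def _connect():
        conn = sqlite3.connect(DB_PATH, timeout=30)
        conn.execute("CREATE TABLE IF NOT EXISTS memo (key BLOB PRIMARY KEY, value BLOB)")
        return conn

    def persistent_memo(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = pickle.dumps((args, sorted(kwargs.items())))
            conn = _connect()
            try:
                row = conn.execute("SELECT value FROM memo WHERE key = ?", (key,)).fetchone()
                if row is not None:
                    return pickle.loads(row[0])
                result = func(*args, **kwargs)
                # INSERT OR REPLACE keeps concurrent writers from failing on duplicate keys.
                conn.execute("INSERT OR REPLACE INTO memo (key, value) VALUES (?, ?)",
                             (key, pickle.dumps(result)))
                conn.commit()
                return result
            finally:
                conn.close()
        return wrapper

For heavy parallel write loads a dedicated cache package or a client/server store would be more appropriate, but for a small memo of ints and floats this kind of sketch is usually enough.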

How do you make a generic memoize function in Haskell?

Submitted by *爱你&永不变心* on 2019-12-03 03:46:58
Question: I've seen the other post about this, but is there a clean way of doing this in Haskell? As a second part, can it also be done without making the function monadic?

Answer 1: This largely follows http://www.haskell.org/haskellwiki/Memoization. You want a function of type (a -> b). If it doesn't call itself, then you can just write a simple wrapper that caches the return values. The best way to store this mapping depends on what properties of a you can exploit. Ordering is pretty much a minimum. With …

Haskell caching results of a function

Submitted by 人走茶凉 on 2019-12-03 02:36:37
I have a function that takes a parameter and produces a result. Unfortunately, it takes quite a long time for the function to produce the result. The function is called quite often with the same input, which is why it would be convenient if I could cache the results. Something like

    let cachedFunction = createCache slowFunction
    in (cachedFunction 3.1) + (cachedFunction 4.2) + (cachedFunction 3.1)

I was looking into Data.Array, and although the array is lazy, I need to initialize it with a list of pairs (using listArray) - which is impractical. If the 'key' is e.g. the 'Double' type, I cannot …

Which Ruby memoize pattern does ActiveSupport::Memoizable refer to?

Submitted by 坚强是说给别人听的谎言 on 2019-12-03 02:10:09
Question: So in Rails 3.2, ActiveSupport::Memoizable has been deprecated. The message reads:

DEPRECATION WARNING: ActiveSupport::Memoizable is deprecated and will be removed in future releases, simply use Ruby memoization pattern instead.

It refers to "Ruby memoization pattern" (singular) as if there's one pattern we should all know and refer to... I presume they mean something like:

    def my_method
      @my_method ||= # ... go get the value
    end

or

    def my_method
      return @my_method if defined?(@my_method)
      @my …