memoization

Python resettable instance method memoization decorator

会有一股神秘感。 · submitted on 2019-11-29 02:04:10
I'm attempting to build a decorator for an instance method of a class that will memoize the result. (This has been done a million times before) However, I'd like the option of being able to reset the memoized cache at any point (say, if something in the instance state changes, which might change the result of the method having nothing to do with its args). So, I attempted to build a decorator as a class instead of a function, so that I might have access to the cache as a class member. This led me down the path of learning about descriptors, specifically the __get__ method, which is where I'm
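
A minimal sketch of the approach the question is heading toward: a descriptor class whose __get__ binds the decorated method to the instance and keeps a per-instance cache with an explicit reset. All names here (memoized, reset, the _cache_ attribute) are illustrative, not taken from the question:

    import functools

    class memoized:
        """Descriptor-based memoizing decorator for instance methods.

        Results are cached per instance; obj.method.reset() clears that
        instance's cache, e.g. after relevant instance state changes.
        """

        def __init__(self, func):
            self.func = func

        def __get__(self, obj, objtype=None):
            if obj is None:                      # accessed on the class itself
                return self.func

            # One cache dict per (instance, method), stored on the instance.
            attr = '_cache_' + self.func.__name__
            cache = obj.__dict__.setdefault(attr, {})

            @functools.wraps(self.func)
            def wrapper(*args):
                if args not in cache:
                    cache[args] = self.func(obj, *args)
                return cache[args]

            wrapper.reset = cache.clear
            return wrapper

    class Grid:
        def __init__(self, size):
            self.size = size

        @memoized
        def area(self, scale):
            return self.size * self.size * scale

    g = Grid(3)
    g.area(2)          # computed and cached
    g.area(2)          # served from the cache
    g.size = 5
    g.area.reset()     # instance state changed, so drop the stale results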

Two parameter memoization in Haskell

Deadly · submitted on 2019-11-29 01:35:07
I'm trying to memoize the following function:

    gridwalk x y
      | x == 0 = 1
      | y == 0 = 1
      | otherwise = (gridwalk (x - 1) y) + (gridwalk x (y - 1))

Looking at this I came up with the following solution:

    gw :: (Int -> Int -> Int) -> Int -> Int -> Int
    gw f x y
      | x == 0 = 1
      | y == 0 = 1
      | otherwise = (f (x - 1) y) + (f x (y - 1))

    gwlist :: [Int]
    gwlist = map (\i -> gw fastgw (i `mod` 20) (i `div` 20)) [0..]

    fastgw :: Int -> Int -> Int
    fastgw x y = gwlist !! (x + y * 20)

Which I then can call like this: gw fastgw 20 20

Is there an easier, more concise and general way (notice how I had to hardcode the
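
For comparison only (the question itself is about Haskell), the same two-argument recursion memoizes without any hardcoded grid width when the cache is keyed on the argument pair; a minimal Python sketch:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def gridwalk(x, y):
        # Number of monotone lattice paths from (0, 0) to (x, y).
        if x == 0 or y == 0:
            return 1
        return gridwalk(x - 1, y) + gridwalk(x, y - 1)

    print(gridwalk(20, 20))   # 137846528820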

Numpy NdArray Memoization

情到浓时终转凉″ · submitted on 2019-11-28 23:15:21
Question: I'm working on some fairly computationally intensive calculations that deal with numpy matrices and ndarrays, and from some digging around, there are about a dozen ways not to implement memoization, generally full of collisions and issues with ndarrays being mutable objects. Has anyone come across a fairly general memoisation decorator that can handle numpy objects?

Answer 1: How about this package: http://packages.python.org/joblib/memory.html

Answer 2: An alternative is my package jug: http://packages
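
The linked packages are not reproduced here; as a rough illustration of one common workaround for ndarrays being mutable and unhashable, the cache key can be built from an immutable digest of the array's contents. This is a sketch under that assumption, not a general-purpose library:

    import hashlib
    import numpy as np

    def np_memoize(func):
        """Memoize a one-ndarray function by hashing the array's bytes, shape and dtype."""
        cache = {}

        def wrapper(arr):
            key = (arr.shape, arr.dtype.str,
                   hashlib.sha1(np.ascontiguousarray(arr).tobytes()).hexdigest())
            if key not in cache:
                cache[key] = func(arr)
            return cache[key]

        return wrapper

    @np_memoize
    def expensive(arr):
        return np.linalg.matrix_power(arr, 10)

Hashing copies and scans the whole array, so this only pays off when the wrapped function is far more expensive than the hash; if the array is later mutated in place, the key changes and the result is simply recomputed.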

Store the functools.lru_cache cache to a file in Python >= 3.2

我们两清 · submitted on 2019-11-28 23:12:19
I'm using @functools.lru_cache in Python 3.3. I would like to save the cache to a file, in order to restore it when the program is restarted. How could I do that?

Edit 1: A possible solution would be to pickle any sort of callable.

Problem pickling __closure__:

    _pickle.PicklingError: Can't pickle <class 'cell'>: attribute lookup builtins.cell failed

If I try to restore the function without it, I get:

    TypeError: arg 5 (closure) must be tuple

Bakuriu: You can't do what you want using lru_cache, since it doesn't provide an API to access the cache, and it might be rewritten in C in future releases.
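
Since lru_cache keeps its cache private, the usual workaround is to maintain your own dict and pickle that instead of the decorated function; a minimal sketch (the decorator name, save helper and file name are made up for illustration):

    import pickle
    from pathlib import Path

    def persistent_memoize(path):
        """Memoize into a plain dict and pickle it to `path` on demand."""
        def decorator(func):
            cache = {}
            p = Path(path)
            if p.exists():
                with p.open('rb') as fh:
                    cache.update(pickle.load(fh))

            def wrapper(*args):
                if args not in cache:
                    cache[args] = func(*args)
                return cache[args]

            def save():
                with p.open('wb') as fh:
                    pickle.dump(cache, fh)

            wrapper.save = save
            return wrapper
        return decorator

    @persistent_memoize('fib_cache.pkl')
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    fib(100)
    fib.save()   # call before exit; the cache is reloaded on the next run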

Factorial Memoization in R

瘦欲@ · submitted on 2019-11-28 11:45:17
I wrote this function to find the factorial of a number:

    fact <- function(n) {
      if (n < 0) {
        cat("Sorry, factorial does not exist for negative numbers", "\n")
      } else if (n == 0) {
        cat("The factorial of 0 is 1", "\n")
      } else {
        results = 1
        for (i in 1:n) {
          results = results * i
        }
        cat(paste("The factorial of", n, "is", results, "\n"))
      }
    }

Now I want to implement memoization in R. I have a basic idea of R and am trying to implement it with what I know, but I am not sure this is the right way forward. Could you please also elaborate on this topic? Thanks in advance.

Memoized Factorial:

    fact_tbl <- c(0, 1, rep(NA, 100))
    fact
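
The answer's table-based idea (a vector of known values, filled in as new factorials are computed) looks like this when transcribed to Python; a sketch for illustration only, not the original R answer:

    def make_fact():
        table = {0: 1, 1: 1}              # known factorials, grown on demand

        def fact(n):
            if n < 0:
                raise ValueError("factorial does not exist for negative numbers")
            if n not in table:
                table[n] = n * fact(n - 1)
            return table[n]

        return fact

    fact = make_fact()
    fact(10)    # 3628800, computed once; every value up to 10 is now cached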

C++ Memoization understanding

拜拜、爱过 · submitted on 2019-11-28 11:00:19
Question: I was trying to understand how memoization works in C++, so I looked at an example of memoization used on the Fibonacci sequence.

    std::map<int, int> fibHash;

    int memoized_fib(int n) {
        std::map<int, int>::iterator fibIter = fibHash.find(n);
        if (fibIter != fibHash.end()) return fibIter->second;
        int fib_val;
        if (n <= 1) fib_val = 1;
        else fib_val = memoized_fib(n - 1) + memoized_fib(n - 2);
        fibHash[n] = fib_val;
        return fib_val;
    }

I was a little confused with how the fibHash[n] works. Does it just hold the
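
The part the question is asking about is std::map's operator[], which inserts the key with a default value if it is absent and then lets the assignment overwrite it. A Python transcription of the same function makes the flow easier to see (illustrative only):

    fib_cache = {}

    def memoized_fib(n):
        if n in fib_cache:                 # like fibHash.find(n) != fibHash.end()
            return fib_cache[n]
        fib_val = 1 if n <= 1 else memoized_fib(n - 1) + memoized_fib(n - 2)
        fib_cache[n] = fib_val             # like fibHash[n] = fib_val: insert or overwrite
        return fib_val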

How to perform thread-safe function memoization in c#?

笑着哭i · submitted on 2019-11-28 10:55:39
Here on Stack Overflow I've found code that memoizes single-argument functions:

    static Func<A, R> Memoize<A, R>(this Func<A, R> f)
    {
        var d = new Dictionary<A, R>();
        return a =>
        {
            R r;
            if (!d.TryGetValue(a, out r))
            {
                r = f(a);
                d.Add(a, r);
            }
            return r;
        };
    }

While this code does its job for me, it sometimes fails when the memoized function is called from multiple threads simultaneously: the Add method gets called twice with the same argument and throws an exception. How can I make the memoization thread-safe?

Gman: You can use ConcurrentDictionary.GetOrAdd, which does everything you need:
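
The accepted fix on the C# side is ConcurrentDictionary.GetOrAdd; purely as an illustration of the race being described, here is the same single-argument memoizer guarded by a lock in Python (names are made up):

    import threading

    def memoize_threadsafe(func):
        cache = {}
        lock = threading.Lock()

        def wrapper(arg):
            with lock:                      # look up under the lock
                if arg in cache:
                    return cache[arg]
            result = func(arg)              # may run more than once, like GetOrAdd's factory
            with lock:
                return cache.setdefault(arg, result)   # insert under the lock; first writer wins

        return wrapper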

How do I write a generic memoize function?

喜欢而已 · submitted on 2019-11-28 07:41:30
I'm writing a function to find triangle numbers and the natural way to write it is recursively:

    function triangle (x)
      if x == 0 then return 0 end
      return x + triangle(x-1)
    end

But attempting to calculate the first 100,000 triangle numbers fails with a stack overflow after a while. This is an ideal function to memoize, but I want a solution that will memoize any function I pass to it.

Lee Baldwin: I bet something like this should work with variable argument lists in Lua:

    local function varg_tostring(...)
      local s = select(1, ...)
      for n = 2, select('#', ...) do
        s = s..","..select(n,...)
      end
      return s
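
The Lua answer builds a string key out of the varargs; the analogous generic memoizer in Python can simply key the cache on the argument tuple. A sketch of that idea (not the Lua answer itself):

    def memoize(func):
        cache = {}

        def wrapper(*args):
            if args not in cache:          # a tuple of hashable arguments works as a key
                cache[args] = func(*args)
            return cache[args]

        return wrapper

    @memoize
    def triangle(x):
        return 0 if x == 0 else x + triangle(x - 1)

    for n in range(100000):
        triangle(n)    # filling the cache from 0 upward keeps each call's recursion shallow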

Automatic memoizing in functional programming languages

谁都会走 · submitted on 2019-11-28 06:50:59
I always thought that Haskell would do some sort of automatic, intelligent memoizing. E.g., the naive Fibonacci implementation

    fib 0 = 0
    fib 1 = 1
    fib n = fib (n-2) + fib (n-1)

would be fast because of that. Now I read this and it seems I was wrong -- Haskell doesn't seem to do automatic memoization. Or do I understand something wrong? Are there other languages which do automatic (i.e. implicit, not explicit) memoization? What are common ways to implement memoization? In all the sample implementations I have seen, they use a hashmap, but there isn't any limit on its size. Obviously, this wouldn't

Why do I get the value “result” for this closure?

為{幸葍}努か · submitted on 2019-11-28 06:21:19
Question: Let's say I have this code (fiddle) intended to memoize modules:

    var chat = {
      // Create this closure to contain the cached modules
      module: function() {
        // Internal module cache.
        var modules = {};

        console.log('in module:', name); // <---------- "in return: result"

        // Create a new module reference scaffold or load an
        // existing module.
        return function(name) {
          console.log('in return:', name); // <---------- "in return: derp"

          // If this module has already been created, return it.
          if (modules