lazy-evaluation

Hibernate could not initialize proxy - no Session

Submitted by 喜欢而已 on 2019-12-21 03:45:40
Question: My code retrieves all information related to the user: SessionFactory sessionFactory = HibernateUtilities.configureSessionFactory(); Session session = sessionFactory.openSession(); UserDetails ud = null; Set<Address> userAddress = null; try { session.beginTransaction(); ud = (UserDetails) session.get(UserDetails.class, 1); userAddress = ud.getAddresses(); session.getTransaction().commit(); } catch (HibernateException e) { e.printStackTrace(); session.getTransaction().rollback(); } finally { …

When does a Stream need to be lazy?

Submitted by 独自空忆成欢 on 2019-12-21 03:43:28
Question: The following are both meant to create a Stream of integers: val s: Stream[Int] = 1 #:: s.map(_ + 1) def makeStream = { val s: Stream[Int] = 1 #:: s.map(_ + 1) s } The first is fine; however, the makeStream method won't compile: error: forward reference extends over definition of value s val s: Stream[Int] = 1 #:: s.map(_ + 1) ^ It only compiles if we make s a lazy val. Why does it need to be a lazy val in a method, but not outside? Answer 1: Inside a class, a val definition decompiles into an …
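
The fix the question itself points to, sketched out (my own minimal example, assuming Scala 2.12's Stream; in 2.13+ LazyList plays the same role): marking s as a lazy val inside the method defers the self-reference until the stream is forced, so the forward-reference error goes away.

```scala
// Minimal sketch of the lazy-val fix inside a method body.
// `#::` is lazy in its tail, so `s.map(_ + 1)` is not evaluated until the
// stream is actually forced; `lazy val` makes the self-reference legal here.
def makeStream: Stream[Int] = {
  lazy val s: Stream[Int] = 1 #:: s.map(_ + 1)
  s
}

// makeStream.take(5).toList  ==>  List(1, 2, 3, 4, 5)
```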

How to optimize this short factorial function in scala? (Creating 50000 BigInts)

Submitted by 你离开我真会死。 on 2019-12-21 03:35:10
Question: I've compared the Scala version (BigInt(1) to BigInt(50000)).reduce(_ * _) to the Python version reduce(lambda x,y: x*y, range(1,50000)) and it turns out that the Scala version took about 10 times longer than the Python version. I'm guessing a big difference is that Python can use its native long type instead of creating new BigInt objects for each number. But is there a workaround in Scala? Answer 1: The fact that your Scala code creates 50,000 BigInt objects is unlikely to be making much of a …
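
Not the quoted answer, but one workaround worth sketching (the function name and structure below are my own): the left fold keeps multiplying an enormous partial product by a tiny factor, whereas a balanced, divide-and-conquer product keeps both operands of each multiplication roughly the same size, which is where java.math.BigInteger's sub-quadratic multiplication (Java 8+) helps most.

```scala
// Hypothetical helper (not from the question): multiply the range [lo, hi]
// as a balanced tree instead of a left fold, so each multiplication combines
// two partial products of roughly equal size.
def rangeProduct(lo: Int, hi: Int): BigInt =
  if (lo > hi) BigInt(1)
  else if (lo == hi) BigInt(lo)
  else {
    val mid = lo + (hi - lo) / 2
    rangeProduct(lo, mid) * rangeProduct(mid + 1, hi)
  }

val factorial50000 = rangeProduct(1, 50000)
```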

Funky haskell lazy list implicit recursion

Submitted by 孤街醉人 on 2019-12-20 18:33:52
Question: In Haskell, you can build infinite lists due to laziness: Prelude> let g = 4 : g Prelude> g !! 0 4 Prelude> take 10 g [4,4,4,4,4,4,4,4,4,4] Now, what exactly goes on when I try to construct a list like this? Prelude> let f = f !! 10 : f Prelude> f !! 0 Interrupted. Prelude> take 10 f [Interrupted. Prelude> The Interrupted. lines are me hitting CTRL+C after waiting a few seconds. It seems to go into an infinite loop, but why is that the case? Explanation for non-Haskellers: The : operator is …
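
For readers more at home in Scala than Haskell, here is a rough analogue of the working first example (my own sketch, not part of the question). The second example has no terminating counterpart: its head is defined as the list's own eleventh element, and reaching that element leads straight back to the head, so evaluation never bottoms out.

```scala
// Rough Scala analogue of Haskell's `let g = 4 : g` (Scala 2.12 Stream;
// LazyList in 2.13). `#::` is lazy in its tail and `lazy val` defers the
// self-reference, so this builds the same infinite list of 4s.
lazy val g: Stream[Int] = 4 #:: g

// g.take(10).toList  ==>  List(4, 4, 4, 4, 4, 4, 4, 4, 4, 4)
```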

When are scala's for-comprehensions lazy?

Submitted by 梦想与她 on 2019-12-20 18:03:14
Question: In Python, I can do something like this: lazy = ((i,j) for i in range(0,10000) for j in range(0,10000)) sum((1 for i in lazy)) It will take a while, but the memory use is constant. The same construct in Scala: (for(i<-0 to 10000; j<-i+1 to 10000) yield (i,j)).count((a:(Int,Int)) => true) After a while, I get a java.lang.OutOfMemoryError, even though it should be evaluated lazily. Answer 1: Nothing's inherently lazy about Scala's for-comprehension; it's syntactic sugar* which won't change the fact …
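
One way to get the constant-memory behaviour of the Python generator (a sketch of my own, not the quoted answer): run the comprehension over Iterators, or a .view, so the pairs are produced on demand instead of being collected into a sequence first.

```scala
// Sketch: the same comprehension desugared over Iterator is lazy, so pairs
// are generated one at a time and memory use stays constant.
val lazyPairs: Iterator[(Int, Int)] =
  for {
    i <- Iterator.range(0, 10000)
    j <- Iterator.range(i + 1, 10000)
  } yield (i, j)

println(lazyPairs.count(_ => true))   // no intermediate collection is built
```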

How to not fall into R's 'lazy evaluation trap'

Submitted by 情到浓时终转凉″ on 2019-12-20 12:28:41
问题 "R passes promises, not values. The promise is forced when it is first evaluated, not when it is passed.", see this answer by G. Grothendieck. Also see this question referring to Hadley's book. In simple examples such as > funs <- lapply(1:10, function(i) function() print(i)) > funs[[1]]() [1] 10 > funs[[2]]() [1] 10 it is possible to take such unintuitive behaviour into account. However, I find myself frequently falling into this trap during daily development. I follow a rather functional

How to create lazy_evaluated dataframe columns in Pandas

Submitted by 天涯浪子 on 2019-12-20 12:15:25
Question: A lot of times, I have a big dataframe df to hold the basic data, and need to create many more columns to hold derivative data calculated from the basic data columns. I can do that in Pandas like: df['derivative_col1'] = df['basic_col1'] + df['basic_col2'] df['derivative_col2'] = df['basic_col1'] * df['basic_col2'] .... df['derivative_coln'] = func(list_of_basic_cols) etc. Pandas will calculate and allocate the memory for all derivative columns at once. What I want now is to have a lazy …

Accessing a non-static member via Lazy<T> or any lambda expression

Submitted by 戏子无情 on 2019-12-20 08:56:08
Question: I have this code: public class MyClass { public int X { get; set; } public int Y { get; set; } private Lazy<int> lazyGetSum = new Lazy<int>(new Func<int>(() => X + Y)); public int Sum{ get { return lazyGetSum.Value; } } } It gives me this error: A field initializer cannot reference the non-static field, method, or property. I think it is very reasonable to access a non-static member via Lazy<T>; how can I do this? * EDIT * The accepted answer solves the problem perfectly, but to see the detailed and …
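
As a side note for comparison with the Scala entries on this page (my own illustration, not the accepted C# fix): the pattern being reached for, a lazily computed value that can see the instance's other members, is a single keyword in Scala.

```scala
// Cross-language illustration only, not the C# solution: a Scala lazy val
// is initialized on first access, at which point x and y are in scope.
class MyClass(var x: Int, var y: Int) {
  lazy val sum: Int = x + y   // computed once, when `sum` is first read
}
```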

Efficient table for Dynamic Programming in Haskell

Submitted by 亡梦爱人 on 2019-12-20 08:45:07
Question: I've coded up the 0-1 Knapsack problem in Haskell. I'm fairly proud of the laziness and level of generality achieved so far. I start by providing functions for creating and dealing with a lazy 2D matrix: mkList f = map f [0..] mkTable f = mkList (\i -> mkList (\j -> f i j)) tableIndex table i j = table !! i !! j I then make a specific table for a given knapsack problem: knapsackTable = mkTable f where f 0 _ = 0 f _ 0 = 0 f i j | ws!!i > j = leaveI | otherwise = max takeI leaveI where takeI …
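
For comparison with the other Scala entries here (a sketch of my own; it is not the answer to the Haskell question): the same lazy 2D table helpers translate almost line for line, since Stream cells are computed on demand and memoized.

```scala
// Hedged Scala translation of the helpers above (Scala 2.12 Stream; LazyList
// in 2.13). Each row, and each cell within a row, is computed on first
// access and then cached by the stream itself.
def mkList[A](f: Int => A): Stream[A] = Stream.from(0).map(f)

def mkTable[A](f: (Int, Int) => A): Stream[Stream[A]] =
  mkList(i => mkList(j => f(i, j)))

def tableIndex[A](table: Stream[Stream[A]], i: Int, j: Int): A = table(i)(j)
```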