lazy-evaluation

How do I make this algorithm lazier without repeating myself?

淺唱寂寞╮ submitted on 2020-01-03 10:05:39

Question: (Inspired by my answer to this question.) Consider this code (it's supposed to find the largest element that's less than or equal to a given input):

    data TreeMap v = Leaf | Node Integer v (TreeMap v) (TreeMap v)
      deriving (Show, Read, Eq, Ord)

    closestLess :: Integer -> TreeMap v -> Maybe (Integer, v)
    closestLess i = precise Nothing
      where
        precise :: Maybe (Integer, v) -> TreeMap v -> Maybe (Integer, v)
        precise closestSoFar Leaf = closestSoFar
        precise closestSoFar (Node k v l r) = case i …
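
A sketch of the same "largest key less than or equal to i" lookup rendered in Scala, with the accumulator passed by name so the fallback result is only demanded once a Leaf is reached. The excerpt above is cut off, so the comparison logic here is a reconstruction for illustration, not the asker's code:

    sealed trait TreeMap[+V]
    case object Leaf extends TreeMap[Nothing]
    final case class Node[V](k: BigInt, v: V, l: TreeMap[V], r: TreeMap[V]) extends TreeMap[V]

    def closestLess[V](i: BigInt)(t: TreeMap[V]): Option[(BigInt, V)] = {
      // closestSoFar is by-name: the "best so far" is only evaluated at a Leaf
      def precise(closestSoFar: => Option[(BigInt, V)], rest: TreeMap[V]): Option[(BigInt, V)] =
        rest match {
          case Leaf => closestSoFar
          case Node(k, v, l, r) =>
            if (k > i)       precise(closestSoFar, l) // k too big: the answer lies in the left subtree
            else if (k == i) Some((k, v))             // exact match
            else             precise(Some((k, v)), r) // k < i: best candidate so far, try the right subtree
        }
      precise(None, t)
    }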

Initialisation order throws null pointer on lazy val access

大兔子大兔子 submitted on 2020-01-02 08:09:48

Question: As expected, the following initialisation order (without lazy val) throws a NullPointerException:

    class Foo {
      Bar.x // NullPointerException
    }

    object Bar extends Foo {
      val x = 42
    }

    object Hello extends App {
      Bar
    }

Examining the -Xprint:jvm output, and referencing @paradigmatic's answer, we see this is due to Foo's constructor running first and calling Bar.x() before Bar.this.x is initialised in Bar's constructor:

    class Foo extends Object {
      def <init>(): example.Foo = {
        Foo.super.<init>();
        Bar.x();
        () …
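
As the excerpt notes, the exception arises because Foo's constructor runs and calls Bar.x() before Bar has finished constructing. A minimal sketch of one way to defer that access (my own illustration, not necessarily the question's accepted answer): make the superclass read the value through a lazy val, so nothing touches Bar while it is still being initialised:

    class Foo {
      lazy val fromBar: Int = Bar.x   // not evaluated while Foo's constructor runs
    }

    object Bar extends Foo {
      val x = 42
    }

    object Hello extends App {
      println(Bar.fromBar)            // 42: Bar is fully constructed before the lazy val is forced
    }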

Scala Infinite Iterator OutOfMemory

落花浮王杯 submitted on 2020-01-02 05:33:08

Question: I'm playing around with Scala's lazy iterators, and I've run into an issue. What I'm trying to do is read in a large file, do a transformation, and then write out the result:

    object FileProcessor {
      def main(args: Array[String]) {
        val inSource = Source.fromFile("in.txt")
        val outSource = new PrintWriter("out.txt")
        try {
          // this "basic" lazy iterator works fine
          // val iterator = inSource.getLines

          // ...but this one, which incorporates my process method,
          // throws OutOfMemoryExceptions
          val …
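
The excerpt stops before the asker's process method, so purely as a hedged sketch: one common cause of OutOfMemoryError with this pattern is accidentally materialising or memoising the whole sequence; keeping the pipeline as a plain Iterator lets each line be read, transformed and written without being retained. The toUpperCase call below merely stands in for the missing process method; the file names come from the question:

    import scala.io.Source
    import java.io.PrintWriter

    object FileProcessor {
      def main(args: Array[String]): Unit = {
        val inSource  = Source.fromFile("in.txt")
        val outSource = new PrintWriter("out.txt")
        try {
          // The Iterator is consumed element by element; nothing holds on to
          // earlier lines, so memory use stays roughly constant.
          val transformed: Iterator[String] = inSource.getLines().map(_.toUpperCase)
          transformed.foreach(line => outSource.println(line))
        } finally {
          inSource.close()
          outSource.close()
        }
      }
    }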

How to make stats in constant memory

孤者浪人 submitted on 2020-01-01 18:19:54

Question: I have a function which produces random numerical results. I know that the result will be an integer in a small range [a, b] (roughly 50 values wide). I want to write a function that executes the above function, say, 1,000,000 times and counts how often each result appears. (The function takes a random generator to produce its result.) The problem is, I don't know how to do this in constant memory without hard-coding the range's length. My (bad) approach looks like this:

    values :: …
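
The question itself is Haskell, but the counting idea is language-independent: fold the draws into a table of counts instead of materialising the results, so memory is proportional to the size of the range, not to the number of runs. A Scala sketch of that idea (the names and the 0-to-49 range below are illustrative):

    import scala.util.Random

    def histogram(a: Int, b: Int, n: Int, rng: Random): Map[Int, Int] =
      Iterator.fill(n)(a + rng.nextInt(b - a + 1))        // draws are produced lazily, one at a time
        .foldLeft(Map.empty[Int, Int]) { (acc, x) =>
          acc.updated(x, acc.getOrElse(x, 0) + 1)         // only the at most (b - a + 1) counts are kept
        }

    // histogram(0, 49, 1000000, new Random())            // e.g. tally 1,000,000 draws from 0..49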

MATLAB variable passing and lazy assignment

自作多情 submitted on 2020-01-01 12:07:55

Question: I know that in Matlab there is a 'lazy' evaluation when a new variable is assigned to an existing one, such as:

    array1 = ones(1,1e8);
    array2 = array1;

The value of array1 won't be copied to array2 unless an element of array2 is modified. From this I supposed that all variables in Matlab are actually value types and are all passed by value (although lazy copying is used). This also implies that the variables are created on the call stack. Well, I am not judging the way it treats the …
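
What the question describes is copy-on-write rather than lazy evaluation in the functional sense: the assignment shares the underlying buffer, and a real copy is made only when one of the aliases is written to. Purely as an illustration of that mechanism (it says nothing about how MATLAB actually implements it), a toy copy-on-write wrapper in Scala:

    // Toy copy-on-write array: aliasing is cheap, the copy happens on the first write.
    final class CowArray private (private var data: Array[Double], private var shared: Boolean) {
      def this(data: Array[Double]) = this(data, false)

      // "Assignment": share the buffer, copy nothing yet.
      def alias(): CowArray = {
        shared = true
        new CowArray(data, true)
      }

      def apply(i: Int): Double = data(i)

      // The copy is deferred until one of the aliases is modified.
      def update(i: Int, v: Double): Unit = {
        if (shared) { data = data.clone(); shared = false }
        data(i) = v
      }
    }

    // val a1 = new CowArray(Array.fill(100000000)(1.0))
    // val a2 = a1.alias()   // cheap: no copy yet
    // a2(0) = 2.0           // now a2 gets its own buffer; a1 is untouched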

Use of lazy val for caching string representation

ぃ、小莉子 submitted on 2020-01-01 09:42:00

Question: I encountered the following code in JAXMag's Scala special issue:

    package com.weiglewilczek.gameoflife

    case class Cell(x: Int, y: Int) {
      override def toString = position
      private lazy val position = "(%s, %s)".format(x, y)
    }

Does the use of lazy val in the above code provide considerably more performance than the following code?

    package com.weiglewilczek.gameoflife

    case class Cell(x: Int, y: Int) {
      override def toString = "(%s, %s)".format(x, y)
    }

Or is it just a case of unnecessary …
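
For comparison, the two variants side by side: the lazy val memoises the formatted string once per Cell instance, while the plain override re-formats on every call. The caching only pays off when toString is called repeatedly on the same instance; otherwise it just adds a field and an initialisation check (a general observation, not the article's benchmark):

    case class CachedCell(x: Int, y: Int) {
      override def toString = position
      private lazy val position = "(%s, %s)".format(x, y)  // formatted at most once per instance
    }

    case class PlainCell(x: Int, y: Int) {
      override def toString = "(%s, %s)".format(x, y)      // re-formatted on every call
    }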

Scala: Streams not acting lazy?

若如初见. submitted on 2020-01-01 07:56:10

Question: I know streams are supposed to be lazily evaluated sequences in Scala, but I think I am suffering from some sort of fundamental misunderstanding, because they seem to be more eager than I would have expected. In this example:

    val initial = Stream(1)
    lazy val bad = Stream(1/0)
    println((initial ++ bad) take 1)

I get a java.lang.ArithmeticException, which seems to be caused by the division by zero. I would expect that bad would never get evaluated, since I only asked for one element from the stream. What …
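
Only the tail of a Stream is lazy: Stream(1/0) evaluates 1/0 as soon as bad itself is constructed, because Stream.apply takes ordinary (strict) varargs. As a hedged sketch (not the question's accepted answer), Scala 2.13's LazyList, which is lazy in both head and tail, shows the behaviour the asker expected:

    val initial = LazyList(1)
    lazy val bad: LazyList[Int] = (1 / 0) #:: LazyList.empty[Int]  // head is by-name here, so 1/0 is never evaluated
    println((initial ++ bad).take(1).toList)                       // List(1), no ArithmeticException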

Parsing: Lazy initialization and mutually recursive monads in F#

谁都会走 submitted on 2020-01-01 06:44:09

Question: I've been writing a little monadic parser-combinator library in F# (somewhat similar to FParsec) and have now tried to implement a small parser for a programming language. I first implemented the code in Haskell (with Parsec), which ran perfectly well. The parsers for infix expressions are designed to be mutually recursive:

    parseInfixOp :: Parser String -> Parser Expression -> Parser Expression
    parseInfixOp operatorParser subParser = ignoreSpaces $ do
      x <- ignoreSpaces $ subParser
      do op <- ignoreSpaces $ …
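
The question is about F#, but the general remedy is the same in any strict language: break the cycle between mutually recursive parsers by referring to the other parser lazily (by name), so it is only dereferenced at parse time rather than at construction time. A toy Scala sketch of that idea (the Parser type below is illustrative, not FParsec or the asker's library):

    object LazyParsers {
      // Toy parser: consumes a prefix of the input and returns a value plus the rest.
      final case class Parser[A](run: String => Option[(A, String)])

      def char(c: Char): Parser[Char] = Parser { s =>
        if (s.nonEmpty && s.head == c) Some((c, s.tail)) else None
      }

      // The alternative takes its second parser by name, so building `expr`
      // does not require `parens` to exist yet.
      def or[A](p: Parser[A], q: => Parser[A]): Parser[A] = Parser { s =>
        p.run(s).orElse(q.run(s))
      }

      // Mutually recursive grammar: expr ::= 'x' | '(' expr ')'
      lazy val expr: Parser[Char] = or(char('x'), parens)
      lazy val parens: Parser[Char] = Parser { s =>
        for {
          (_, r1) <- char('(').run(s)
          (e, r2) <- expr.run(r1)
          (_, r3) <- char(')').run(r2)
        } yield (e, r3)
      }
    }

    // LazyParsers.expr.run("((x))")   // Some((x,""))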
