lazy-sequences

What will the behaviour of line-seq be?

馋奶兔 submitted on 2019-12-13 00:09:50

Question: I'd like to understand the behaviour of a lazy sequence if I iterate over it with doseq but hold onto part of the first element.

    (with-open [log-file-reader (clojure.java.io/reader
                                  (clojure.java.io/file input-file-path))]
      ;; parse-line returns some kind of representation of the line.
      (let [parsed-lines (map parse-line (line-seq log-file-reader))
            first-item (first parsed-lines)]
        ;; Iterate over the parsed lines
        (doseq [line parsed-lines]
          ;; Do something with a side-effect
          )))

I don't want
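A rough Python sketch of the laziness at play (the io.StringIO data and trivial parse_line are made-up stand-ins, not the question's code): a lazily mapped sequence realizes elements only as they are consumed, and, unlike a Clojure seq, a Python map object is single-pass, so taking the first item removes it from what the later traversal sees.

```python
import io

def parse_line(line):
    # Hypothetical parser: returns some representation of the line.
    return line.strip().upper()

log = io.StringIO("alpha\nbeta\ngamma\n")
parsed_lines = map(parse_line, log)   # lazy, like (map parse-line (line-seq ...))

first_item = next(parsed_lines)       # realizes only the first element
rest = list(parsed_lines)             # the doseq-style traversal sees the remainder

print(first_item)   # ALPHA
print(rest)         # ['BETA', 'GAMMA']
```

In Clojure the doseq would still see all three elements; the concern the question is probing is instead that holding the parsed-lines binding while walking it can retain every realized element in memory (head retention).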

lazy version of mapM

不羁岁月 submitted on 2019-12-12 07:46:44

Question: Suppose I'm getting a large list of items while working with IO:

    as <- getLargeList

Now I'm trying to apply fn :: a -> IO b onto as:

    as <- getLargeList
    bs <- mapM fn as

mapM has the type mapM :: Monad m => (a -> m b) -> [a] -> m [b], and that's what I need in terms of type matching. But it builds the whole chain in memory before returning the result. I'm looking for an analog of mapM that works lazily, so that I may use the head of bs while its tail is still being built.

Answer 1: Do not use unsafeInterleaveIO
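For comparison, the streaming behaviour the question asks for is what a generator gives in Python: each effectful call runs only when the corresponding result is demanded. A minimal sketch (fetch is a made-up stand-in for fn :: a -> IO b):

```python
def lazy_map_m(fn, xs):
    # Generator analogue of a lazy mapM: run fn on demand, one element at a time.
    for x in xs:
        yield fn(x)

calls = []

def fetch(x):
    # Stand-in for an effectful function; records that its effect actually ran.
    calls.append(x)
    return x * 10

bs = lazy_map_m(fetch, range(1_000_000))
head = next(bs)       # only the first effect has run so far
print(head, calls)    # 0 [0]
```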

How do I create a call-by-need list with increasing size in Standard ML?

冷暖自知 submitted on 2019-12-11 07:24:04

Question: I am trying to create a lazy list whose elements together represent all the combinations of zeros and ones. Example: [[], [0], [1], [0,0], [0,1], [1,0], ...] Is this even possible in ML? I can't seem to find a way to change the pattern of the list elements once I have defined it. It also seems that I would need to define how the binary pattern changes, which is not really possible in a functional language (I've never encountered binary representations in a functional language)?

Answer 1:
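Whatever the Standard ML encoding ends up looking like, the enumeration itself is easy to state lazily: for each length n, produce all n-element sequences over {0, 1}. A Python sketch of that ordering:

```python
from itertools import count, islice, product

def binary_lists():
    # Lazily yield every list over {0, 1}, shortest first:
    # [], [0], [1], [0,0], [0,1], [1,0], [1,1], [0,0,0], ...
    for n in count(0):
        for combo in product([0, 1], repeat=n):
            yield list(combo)

print(list(islice(binary_lists(), 7)))
# [[], [0], [1], [0, 0], [0, 1], [1, 0], [1, 1]]
```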

How can I create a lazy-seq vector

点点圈 submitted on 2019-12-10 14:16:39

Question: Running this works as expected:

    (defn long-seq [n]
      (lazy-seq
        (cons (list n {:somekey (* n 2)})
              (long-seq (+ n 1)))))

    (take 3 (long-seq 3))
    ; => ((3 {:somekey 6}) (4 {:somekey 8}) (5 {:somekey 10}))

However, I would like to do the same thing with a vector:

    (defn long-seq-vec [n]
      (lazy-seq
        (into (vector (list n {:somekey (* n 2)}))
              (long-seq-vec (+ n 1)))))

    (take 3 (long-seq-vec 3))

This gives me a stack overflow. Why?

Answer 1: The main reason is that vectors aren't lazy, so the into call greedily
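The eager-versus-lazy distinction behind the stack overflow can be reproduced in any language; here is a Python sketch (an analogy, not the Clojure answer itself), where the generator suspends its recursive step but the list-building version must finish the infinite recursion before returning anything:

```python
from itertools import islice

def long_seq(n):
    # Lazy: the recursive call is suspended until a consumer asks for more.
    yield (n, {"somekey": n * 2})
    yield from long_seq(n + 1)

def long_seq_eager(n):
    # Eager, like `into` on a vector: recurses fully before returning.
    return [(n, {"somekey": n * 2})] + long_seq_eager(n + 1)

print(list(islice(long_seq(3), 3)))
# [(3, {'somekey': 6}), (4, {'somekey': 8}), (5, {'somekey': 10})]

try:
    long_seq_eager(3)
except RecursionError:
    print("stack overflow, as with the vector version")
```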

Infinite fibonacci sequence

僤鯓⒐⒋嵵緔 submitted on 2019-12-09 16:17:57

Question: I'm trying to imitate Haskell's famous infinite Fibonacci list in F# using sequences. Why doesn't the following sequence evaluate as expected? How is it being evaluated?

    let rec fibs =
      lazy (Seq.append (Seq.ofList [0; 1])
                       (Seq.map2 (+) (fibs.Force())
                                     (Seq.skip 1 (fibs.Force()))))

Answer 1: The problem is that your code still isn't lazy enough: the arguments to Seq.append are evaluated before the result can be accessed, but evaluating the second argument (Seq.map2 ...) requires evaluating its
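For reference, the behaviour being imitated, an infinite Fibonacci sequence where nothing is computed until demanded, reads naturally as a generator in Python (a sketch of the target semantics, not the F# fix):

```python
from itertools import islice

def fibs():
    # Infinite Fibonacci sequence, produced strictly on demand.
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

print(list(islice(fibs(), 8)))   # [0, 1, 1, 2, 3, 5, 8, 13]
```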

Lazy fibonacci in Ruby

泄露秘密 submitted on 2019-12-08 00:08:55

Question: I can write a lazy Fibonacci sequence in Clojure like this:

    (def fib (lazy-cat [1 1] (map +' fib (rest fib))))

and I'm trying (unsuccessfully) to write it in Ruby like this:

    fib = Enumerator.new do |yielder|
      yielder << 1 << 1
      fib.zip(fib.drop(1)).map do |a, b|
        yielder << (a + b)
      end
    end

In the simplified case, this works:

    fib = Enumerator.new do |yielder|
      yielder << 1 << 1
      puts "here"
    end

    puts fib.take(2).inspect
    puts fib.drop(1).take(1).inspect

but this doesn't:

    fib = Enumerator.new do |yielder|

Proving equality on coinductive lazy lists in Coq

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-07 23:14:27

Question: I am experimenting with Coq coinductive types. I use the lazy list type from the Coq'Art book (sect. 13.1.4):

    Set Implicit Arguments.

    CoInductive LList (A:Set) : Set :=
    | LNil : LList A
    | LCons : A -> LList A -> LList A.

    Implicit Arguments LNil [A].

    CoFixpoint LAppend (A:Set) (u v:LList A) : LList A :=
      match u with
      | LNil => v
      | LCons a u' => LCons a (LAppend u' v)
      end.

In order to satisfy the guard condition I also use the following decomposition functions from this book: Definition LList

How to produce a lazy sequence by portion in clojure?

雨燕双飞 submitted on 2019-12-07 07:20:46

Question: I have a database server and I fetch data from it. Sometimes the data has millions of rows or more, so I use laziness to download it. I use server-side cursors from the clojure.jdbc library (https://funcool.github.io/clojure.jdbc/latest/#cursor-queries) to fetch data lazily. Now I have a problem: I need to produce the initial 500 elements from a lazy sequence, then the program must wait for 10 minutes until it gets a signal telling it to produce the next 500 elements, and so on until I receive all the data from
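The produce-a-portion-on-signal pattern is language-agnostic; here is a Python sketch, where wait_for_signal is a hypothetical stand-in for the 10-minute gate and range(12) stands in for the lazy cursor result:

```python
from itertools import islice

def in_portions(seq, size, wait_for_signal):
    # Yield `size` elements at a time, blocking on wait_for_signal after
    # each portion (including, in this simple sketch, the last one).
    it = iter(seq)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk
        wait_for_signal()

signals = []
portions = list(in_portions(range(12), 5, lambda: signals.append("go")))
print(portions)        # [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9], [10, 11]]
print(len(signals))    # 3
```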

Scala error: “forward reference extends over definition of value” when code appears in a function

社会主义新天地 submitted on 2019-12-06 16:30:04

Question: I'm trying to compile the following code, using Scala 2.11.7.

    object LucasSeq {
      val fibo: Stream[Int] = 0 #:: 1 #:: fibo.zip(fibo.tail).map { pair =>
        pair._1 + pair._2
      }

      def firstKind(p: Int, q: Int): Stream[Int] = {
        val lucas: Stream[Int] = 0 #:: 1 #:: lucas.zip(lucas.tail).map { pair =>
          p * pair._2 - q * pair._1
        }
        lucas
      }
    }

fibo is based on the Fibonacci sequence example in Scala's Stream documentation, and it works. However, the firstKind function, which tries to generalize the sequence

Proving equality on coinductive lazy lists in Coq

南笙酒味 submitted on 2019-12-06 08:40:12

I am experimenting with Coq coinductive types. I use the lazy list type from the Coq'Art book (sect. 13.1.4):

    Set Implicit Arguments.

    CoInductive LList (A:Set) : Set :=
    | LNil : LList A
    | LCons : A -> LList A -> LList A.

    Implicit Arguments LNil [A].

    CoFixpoint LAppend (A:Set) (u v:LList A) : LList A :=
      match u with
      | LNil => v
      | LCons a u' => LCons a (LAppend u' v)
      end.

In order to satisfy the guard condition I also use the following decomposition function from this book:

    Definition LList_decomp (A:Set) (l:LList A) : LList A :=
      match l with
      | LNil => LNil
      | LCons a l' => LCons a l'
      end.

    Lemma