generator

Is ANTLR an appropriate tool to serialize/deserialize a binary data format?

泪湿孤枕 posted on 2020-01-13 08:03:09
Question: I need to read and write octet streams to send over various networks to communicate with smart electric meters. There is an ANSI standard, ANSI C12.19, that describes the binary data format. While the data format is not overly complex, the standard is very large (500+ pages) because it describes many distinct types. The standard is fully described by an EBNF grammar. I am considering using ANTLR to read the EBNF grammar, or a modified version of it, and create C# classes that can read and

How do I reverse an itertools.chain object?

删除回忆录丶 posted on 2020-01-12 14:51:11
Question: My function creates a chain of generators:

def bar(num):
    import itertools
    some_sequence = (x*1.5 for x in range(num))
    some_other_sequence = (x*2.6 for x in range(num))
    chained = itertools.chain(some_sequence, some_other_sequence)
    return chained

My function sometimes needs to return chained in reversed order. Conceptually, the following is what I would like to be able to do:

if num < 0:
    return reversed(chained)
return chained

Unfortunately:

>>> reversed(chained)
TypeError: argument to reversed
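
The excerpt cuts off mid-error, but the underlying issue is that reversed() only accepts a sequence or an object implementing __reversed__, and an itertools.chain object is a one-shot iterator with neither. A minimal sketch of one workaround, assuming the chained inputs are finite enough to materialize (maybe_reversed is a made-up helper name, not part of the original question):

from itertools import chain

def maybe_reversed(iterable, backwards):
    # reversed() rejects plain iterators, so pull everything into a
    # tuple first; this only makes sense for finite inputs
    if backwards:
        return reversed(tuple(iterable))
    return iter(iterable)

gen = chain((x * 1.5 for x in range(3)), (x * 2.6 for x in range(3)))
print(list(maybe_reversed(gen, backwards=True)))
# [5.2, 2.6, 0.0, 3.0, 1.5, 0.0]

If materializing is unacceptable, the usual alternative is to restructure bar() so that, in the reversed case, it builds the constituent sequences as lists and chains their reverses in reverse order.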

Unexpected behaviour with a conditional generator expression [duplicate]

醉酒当歌 posted on 2020-01-11 15:33:56
Question: This question already has answers here: Generator expression uses list assigned after the generator's creation (5 answers). Closed 12 months ago. I was running a piece of code that unexpectedly gave a logic error at one part of the program. When investigating the section, I created a test file to test the set of statements being run and found an unusual bug that seems very odd. I tested this simple code:

array = [1, 2, 2, 4, 5]  # Original array
f = (x for x in array if array.count(x) ==
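
The linked duplicate explains the behaviour: the outermost iterable of a generator expression is evaluated immediately when the expression is created, but the if condition is evaluated lazily, each time an item is requested, so a later rebinding of array is visible to the filter. A minimal sketch of the effect (my own illustration, not the asker's full test file):

array = [1, 2, 2, 4, 5]
f = (x for x in array if array.count(x) == 2)  # iterates the current list object,
                                               # but looks up the name "array" later
array = [2, 2]                                 # rebind the name before consuming f
print(list(f))                                 # [2, 2] -- old elements, new filter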

Recursive Generators in Python

百般思念 posted on 2020-01-10 14:20:34
Question: I wrote a function to return a generator containing every unique combination of sub-strings of a given length that contain more than n elements from a primary string. As an illustration: if I have 'abcdefghi', a probe length of two, and a threshold of 4 elements per list, I'd like to get:

['ab', 'cd', 'ef', 'gh']
['ab', 'de', 'fg', 'hi']
['bc', 'de', 'fg', 'hi']

My first attempt at this problem involved returning a list of lists. This ended up overflowing the memory of the computer. As a
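
The excerpt is cut off before the asker's code, so the following is only a sketch of the usual remedy: make the function itself a recursive generator that yields each combination as it is built, instead of accumulating a list of lists. The function name, signature, and the exact combination rule are my assumptions based on the example above, not the asker's specification:

def probe_combos(s, k, threshold, start=0, acc=None):
    # Recursively build lists of non-overlapping length-k substrings,
    # yielding a combination once it reaches `threshold` pieces.
    # Because results are yielded one at a time, memory use stays
    # proportional to the recursion depth, not to the number of results.
    acc = acc if acc is not None else []
    if len(acc) >= threshold:
        yield list(acc)
    for i in range(start, len(s) - k + 1):
        # pick the substring starting at i, then recurse past it
        yield from probe_combos(s, k, threshold, i + k, acc + [s[i:i+k]])

for combo in probe_combos('abcdefghi', 2, 4):
    print(combo)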

Split a generator into chunks without pre-walking it

拜拜、爱过 posted on 2020-01-08 11:27:58
Question: (This question is related to this one and this one, but those are pre-walking the generator, which is exactly what I want to avoid.) I would like to split a generator into chunks. The requirements are:

- do not pad the chunks: if the number of remaining elements is less than the chunk size, the last chunk must be smaller.
- do not walk the generator beforehand: computing the elements is expensive, and it must only be done by the consuming function, not by the chunker; which means, of course: do not
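
The excerpt ends mid-requirement, but a common recipe that meets the stated constraints is to hand out each chunk as a lazy islice over the same underlying iterator, so elements are only computed when the consumer iterates the chunk. This is a standard pattern, not necessarily the answer the asker accepted; note that it reads one element ahead per chunk, and each chunk must be fully consumed before the next one is requested:

from itertools import chain, islice

def chunks(iterable, size):
    # Yield successive chunks of up to `size` items as lazy iterators.
    # The last chunk is simply shorter; nothing is padded.
    iterator = iter(iterable)
    for first in iterator:             # pulls one element to detect exhaustion
        yield chain([first], islice(iterator, size - 1))

expensive = (x * x for x in range(10))
for chunk in chunks(expensive, 3):
    print(list(chunk))                 # [0, 1, 4], [9, 16, 25], [36, 49, 64], [81]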

Creating an array from ES6 generator

懵懂的女人 posted on 2020-01-07 05:42:06
Question: Let's say I want to assign the result of an ES6 generator to an array variable.

function* gen() {
  for (let i = 0; i < 3; i++) {
    yield i;
  }
}

let [...b] = gen();
console.log(b); // [0, 1, 2]

Here b will be assigned [0, 1, 2]. Why does this work?

Answer 1: A generator, when invoked, returns an iterator. We could, for example, loop through the iterator with a for … of loop:

for (const item of gen()) {
  console.log(item);
}

which would just go through each item in the generator:

0
1
2

The same things

Prime number generator crashes with a memory error if there are too many numbers in the array

£可爱£侵袭症+ posted on 2020-01-07 02:29:24
Question: I have a prime number generator; I was curious to see how small and how fast I could get a prime number generator to be, based on optimizations and such:

from math import sqrt

def p(n):
    if n < 2: return []
    s = [True]*(((n/2)-1+n%2)+1)
    for i in range(int(sqrt(n)) >> 1):
        if not s[i]: continue
        for j in range((i**i+(3*i) << 1) + 3, ((n/2)-1+n%2), (i<<1)+3):
            s[j] = False
    q = [2]; q.extend([(i<<1) + 3 for i in range(((n/2)-1+n%2)) if s[i]]); return len(q), q

print p(input())

The generator works
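
The excerpt is cut off, but the memory problem described in the title usually comes from preallocating a boolean array proportional to n and then building the full result list. A hedged sketch of the standard alternative, not the asker's code: an unbounded, incremental sieve written as a true generator, which yields primes one at a time and only stores one dictionary entry per prime found so far:

from itertools import islice

def primes():
    # Incremental sieve of Eratosthenes: `composites` maps each upcoming
    # composite number to a prime that divides it.
    composites = {}
    n = 2
    while True:
        p = composites.pop(n, None)
        if p is None:
            yield n                   # n was never marked, so it is prime
            composites[n * n] = n     # the first composite it will strike
        else:
            # slide the witness prime forward to its next unmarked multiple
            m = n + p
            while m in composites:
                m += p
            composites[m] = p
        n += 1

print(list(islice(primes(), 10)))     # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]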

Lazy Fibonacci Sequence in Ruby

梦想与她 posted on 2020-01-06 12:49:11
Question: Coming from Python, if I wanted to create an iterative lazy Fibonacci sequence, I could do something like this:

def fib():
    a = 1
    b = 2
    yield a
    yield b
    while True:
        yield a + b
        tmp = a
        a = b
        b = tmp + b

Grabbing next(fib) will give me the next element in the sequence by simply adding the previous two elements, so if I want to get the first 1000 Fibonacci elements, I can do that quickly:

fib = fib()
for i in range(0, 1000):
    print(next(fib))

If I try to reproduce that in Ruby with an Enumerator, it