coroutine

C# Why shouldn't I ever use coroutines?

倖福魔咒の submitted on 2021-02-18 12:20:07
Question: One of the comments on this thread, "Checking condition and calling continuous method with periods of delay unity", said: "Never never ever use coroutines. They teach bad habits from the point of view as a c# developer and will lead to a lynching if you take a regular c# job." My question is: why is this? Is this just in Unity, or in general? Unity's official virtual reality samples https://www.assetstore.unity3d.com/en/#!/content/51519 use them very heavily (especially the flyer example).

Coroutines

不问归期 submitted on 2021-02-17 01:58:16
Coroutines have come up several times before, so today let's look at exactly what a coroutine is, why coroutines exist, and how Python implements them. From studying generators we know that the line yield val produces a value for the client that calls next(gen), then pauses and hands control back to that client; the next time the client calls next(gen), execution resumes at the code right after the yield. Doesn't that feel like two parties cooperating to finish one job? Exactly, and that is what a coroutine is: several components cooperating with one another to accomplish some task. In Python, a coroutine looks syntactically like a generator: both are functions whose body contains the yield keyword. In a coroutine, however, yield usually appears on the right-hand side of an expression (e.g. ret = yield val); it may or may not produce a value, and if there is no expression after yield the generator produces None. A coroutine usually receives data from its caller, and the caller supplies that data through the .send(data) method; typically, the caller pushes values into the coroutine. The yield keyword can even neither receive nor emit data. However the data flows, yield can be seen as a flow-control tool that makes cooperative multitasking possible; that is the working model of a coroutine. How a generator becomes a coroutine: a coroutine is a procedure that cooperates with its caller, yielding values supplied by the caller. Besides .send(), there are also the .throw() and .close() methods. .send() is similar to next(), except that the .send() method
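As a concrete illustration of the flow described above (a minimal sketch, not taken from the article itself): a generator-based coroutine that receives values via .send(), plus the effect of .throw() and .close().

def averager():
    total = 0.0
    count = 0
    average = None
    while True:
        # yield sits on the right-hand side of an expression: it emits the
        # current average and then waits for the caller to push the next
        # term in via .send().
        term = yield average
        total += term
        count += 1
        average = total / count

avg = averager()
next(avg)             # prime the coroutine: run it up to the first yield
print(avg.send(10))   # 10.0
print(avg.send(30))   # 20.0
print(avg.send(5))    # 15.0
try:
    avg.throw(ValueError)   # .throw() raises the exception at the paused yield
except ValueError:
    print("coroutine terminated by .throw()")
avg.close()           # a no-op here; on a live coroutine it raises GeneratorExit at the yield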

How to use dill library for object serialization with shelve library

烂漫一生 submitted on 2021-02-16 21:26:09
Question: I'm using the PyMemoize library to cache a coroutine. I decorated the coroutine, but when Python calls it, I get: TypeError: can't pickle coroutine objects. This happens because PyMemoize internally tries to pickle the coroutine and store it in Redis. For this it uses shelve.Shelf, which in turn uses pickle. The problem is that, for some reason, pickle doesn't support pickling coroutines. I've tried to pickle coroutines with dill and it worked. How do I tell shelve to use dill as serialization
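One possible direction, sketched below; this is not from the question and not PyMemoize's documented API, it only relies on the fact that CPython's shelve module resolves Pickler/Unpickler as module-level names, so rebinding them to dill's drop-in replacements changes how every subsequently opened Shelf serializes values.

import shelve
import dill

# shelve.Shelf serializes values through the Pickler/Unpickler names that
# shelve.py imports at module level from pickle. Rebinding them to dill's
# subclasses makes shelve (de)serialize with dill instead of pickle.
shelve.Pickler = dill.Pickler
shelve.Unpickler = dill.Unpickler

# Hypothetical usage with an arbitrary file name: anything dill can handle
# can now go into the shelf.
with shelve.open("cache.db") as db:
    db["answer"] = lambda x: x * 2   # plain pickle would reject a lambda
    print(db["answer"](21))          # -> 42

Whether a pickled coroutine object is still useful after a round trip is a separate question; this only addresses swapping the serializer.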

Can C++20 coroutines be copied?

流过昼夜 submitted on 2021-02-16 18:07:09
Question: I've been playing around with C++20 coroutines and trying to move some of my codebase over to use them. I've run into an issue, though: it doesn't seem that the new coroutines can be copied. The generator objects have deleted copy constructors and copy-assignment operators, and nothing I've looked into seems to offer a way. Can this be done? For reference, I have written a little test program with a failing attempt at copying C++20 coroutines as well as a successful attempt to do the

Difference between `asyncio.wait([asyncio.sleep(5)])` and `asyncio.sleep(5)`

非 Y 不嫁゛ submitted on 2021-02-15 07:33:49
Question: Could somebody please explain why there is a 5 second delay between coro2 finishing and coro1 finishing? Also, why is there no such delay if I replace asyncio.wait([asyncio.sleep(5)]) with asyncio.sleep(5)?

async def coro1():
    logger.info("coro1 start")
    await asyncio.wait([asyncio.sleep(5)])
    logger.info("coro1 finish")

async def coro2():
    logger.info("coro2 start")
    time.sleep(10)
    logger.info("coro2 finish")

async def main():
    await asyncio.gather(coro1(), coro2())

loop = asyncio.get_event_loop(
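The timing difference comes down to when the 5 second timer starts: asyncio.wait() wraps the sleep in a Task, and that Task cannot begin (and so cannot start its timer) until coro2's blocking time.sleep(10) hands control back to the event loop, whereas a direct await asyncio.sleep(5) starts its timer immediately. Below is a self-contained sketch of both variants; it is not the question's exact code (logger is replaced by timestamped prints, the functions are renamed, and asyncio.run stands in for the truncated loop setup), and the wrapped form only runs on Python versions where asyncio.wait() still accepts bare coroutines (deprecated in 3.8, rejected in 3.11+).

import asyncio
import time

def log(msg):
    print(time.strftime("%X"), msg)

async def coro1_wrapped():
    log("coro1 start")
    # asyncio.wait() wraps the coroutine in a Task; the Task's body (and the
    # 5 s timer inside asyncio.sleep) does not start until the event loop
    # regains control, i.e. only after coro2's blocking time.sleep(10).
    await asyncio.wait([asyncio.sleep(5)])   # accepted only on Python < 3.11
    log("coro1 finish")                      # ~15 s after start

async def coro1_direct():
    log("coro1 start")
    # Awaiting the sleep directly starts its timer right away, so by the time
    # the loop is unblocked at t=10 the 5 s deadline has already passed.
    await asyncio.sleep(5)
    log("coro1 finish")                      # ~10 s after start

async def coro2():
    log("coro2 start")
    time.sleep(10)                           # blocks the whole event loop
    log("coro2 finish")

async def main():
    await asyncio.gather(coro1_wrapped(), coro2())   # swap in coro1_direct() to compare

asyncio.run(main())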

Run code on coroutine close()

雨燕双飞 submitted on 2021-02-08 03:52:56
Question: I am writing code that uses coroutines heavily, and I want reliable behavior on shutdown. Say I have a coroutine and a context manager:

from contextlib import contextmanager

@contextmanager
def print_context_manager(text):
    print("Enter", text)
    yield
    print("Exit", text)

def coro():
    with print_context_manager("coro"):
        while True:
            print("Loop", (yield))

I could use it like this:

c = coro()
next(c)
c.send("Hello ")
c.send("World!")
c.close()

Unfortunately, as far as I can tell, there is no way to
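For context on what close() does here (standard generator semantics, not part of the original excerpt): close() throws GeneratorExit into the coroutine at the paused yield, which unwinds the with block and any try/finally, so the context manager's exit code does run. A minimal sketch of hooking shutdown explicitly by catching GeneratorExit inside the coroutine:

def coro():
    try:
        while True:
            print("Loop", (yield))
    except GeneratorExit:
        # close() raises GeneratorExit at the paused yield; anything here
        # (or in a finally block) runs exactly once during shutdown.
        print("Shutting down cleanly")
        # Do NOT yield in this handler: yielding while handling GeneratorExit
        # makes close() raise RuntimeError("generator ignored GeneratorExit").
        raise        # re-raise (or simply return) so close() completes normally

c = coro()
next(c)              # prime the coroutine up to the first yield
c.send("Hello ")
c.send("World!")
c.close()            # prints "Shutting down cleanly"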
