future

Future.wait() can't wait without a fiber (while waiting on another future in Meteor.method)

Submitted by 霸气de小男生 on 2019-12-01 01:16:25
In Meteor, I'm writing a method that has to check a certain path's subdirectories for new files. I first would like to just list the subdirectories within Meteor, after which I child_process.exec a simple bash script that lists files added since the last time it executed. I'm having some issues getting the directory discovery to be async (Error: Can't wait without a fiber). I've written a synchronous version, but having both fs.readdir and fs.stat instead of their synchronous…

Wait until all Future.onComplete callbacks are executed

Submitted by ﹥>﹥吖頭↗ on 2019-11-30 19:37:29
I am using the Future API from Scala 2.10.x. Here is my use case: object Class1 { def apply(f: (Int) => Future[String])(i: Int): Future[String] = { val start = DateTime.now; val result = f(i); result.onComplete { case _ => println("Started at " + start + ", ended at " + DateTime.now) }; result } } Pretty simple, I think: I'm adding an onComplete callback to my future. Now I'm wondering if there is a way to add a callback for when the onComplete is done executing - in this example, to know when the logging is done. Say my result instance has 3 onComplete callbacks registered; can I know when all of them…
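A common way to get that guarantee, sketched below in plain Scala (value names such as logged are illustrative, not from the question): register the side effects with andThen instead of onComplete. Each andThen returns a new Future that completes only after its callback has run, so the links can be chained and the final Future awaited to know that all of them have executed.

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}

object AndThenExample extends App {
  val result: Future[String] = Future { "done" }

  // Each andThen runs its callback and yields a Future that completes
  // only after that callback has finished, preserving registration order.
  val logged: Future[String] = result
    .andThen { case r => println(s"first callback saw: $r") }
    .andThen { case r => println(s"second callback saw: $r") }
    .andThen { case r => println(s"third callback saw: $r") }

  // By the time this returns, all three callbacks have executed.
  Await.ready(logged, 5.seconds)
}

With plain onComplete there is no such handle: the callbacks run on the execution context in an unspecified order, and nothing is returned that could be awaited.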

Kill or timeout a Future in Scala 2.10

Submitted by 廉价感情. on 2019-11-30 18:50:54
Hi, I'm using Scala 2.10 with the new futures library, and I'm trying to write some code to test an infinite loop. I use a scala.concurrent.Future to run the code with the loop in a separate thread. I would then like to wait a little while to do some testing and then kill off the separate thread/future. I have looked at Await.result, but that doesn't actually kill the future. Is there any way to time out or kill the new Scala 2.10 futures? I would prefer not having to add external dependencies such as Akka just for this simple part. Do not try this at home: import scala.concurrent._ import scala…
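A Future itself cannot be killed once started. One workaround often suggested, sketched below (the Cancellable name is made up for illustration), is to run the loop on a thread you control, complete a Promise with its result, and interrupt that thread to cancel it. The loop body has to cooperate by reacting to interruption.

import scala.concurrent.{Future, Promise}
import scala.util.Try

object Cancellable {
  // Runs `body` on a dedicated thread and returns the Future plus a cancel
  // function that interrupts that thread. Only works if `body` reacts to
  // interruption (Thread.sleep, Thread.interrupted(), blocking I/O, ...).
  def apply[T](body: => T): (Future[T], () => Unit) = {
    val p = Promise[T]()
    val t = new Thread(new Runnable {
      def run(): Unit = p.complete(Try(body))
    })
    t.start()
    (p.future, () => t.interrupt())
  }
}

Used as val (f, cancel) = Cancellable { while (!Thread.interrupted()) { /* work */ } }, followed later by cancel(). Await.result(f, timeout) alone only stops the waiting; it never stops the computation itself.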

How do you post a boost packaged_task to an io_service in C++03?

Submitted by 筅森魡賤 on 2019-11-30 18:41:48
This is a follow-on from a previous question (here), but I'm working on a multithreaded application and I would like to post a Boost packaged_task to a threaded io_service. I'm stuck using a C++03 compiler (so std::move is out), and the packaged_task is not copyable. I've tried wrapping it in a shared_ptr and passing that, and a lot of other things. Here is my current attempt and the subsequent compiler errors. Any idea how to get this to work? boost::asio::io_service io_service; boost::thread_group threads; boost::asio::io_service::work work(io_service); for (int i = 0; i < maxNumThreads; +…

Why doesn't Scala's Future have a .get / get(maxDuration) method, forcing us to resort to Await.result() instead?

Submitted by 。_饼干妹妹 on 2019-11-30 18:12:25
Is there any particular advantage in decoupling the get method from the Future class (where I'd expect it to reside) and instead forcing the coder to have to know about this external two-method class called Await? "Is there any particular advantage in decoupling the get method from the Future class?" Yes: to make it difficult for the developer to do the wrong thing. A Future represents a computation which will complete in the future and might not be available at the current invocation point. If you need to block on a future, why not execute it synchronously? What's the point of scheduling it on…
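For reference, a minimal sketch of the blocking call the question is about, assuming the standard scala.concurrent API. The separate import makes the blocking explicit at the call site, which is exactly the friction the design intends.

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}

object AwaitExample extends App {
  val f: Future[Int] = Future { 21 * 2 }

  // The moral equivalent of a get(maxDuration): blocks the calling thread
  // for at most 5 seconds, then returns the value or throws.
  val answer: Int = Await.result(f, 5.seconds)
  println(answer)
}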

How do I wrap a java.util.concurrent.Future in an Akka Future?

Submitted by 人走茶凉 on 2019-11-30 11:51:24
In a Play Framework 2.0.1 (Scala) application, we are using a web service client library which returns java.util.concurrent.Future as responses. Instead of blocking the Play app on the get() call, we'd like to wrap the j.u.c.Future in an akka.dispatch.Future, so that we can easily use the Play framework's AsyncResult processing. Has anyone done this before, or have a library or example code? UPDATE: The closest thing we've found is this Google Groups discussion: https://groups.google.com…
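One straightforward (if blocking) bridge, sketched here against the scala.concurrent.Future API that Akka's Future later merged into; the helper name wrapJavaFuture is made up for illustration. It simply dispatches the blocking get() onto an execution context, so one pool thread is occupied per wrapped future until it completes.

import java.util.concurrent.{Future => JFuture}
import scala.concurrent.{ExecutionContext, Future}

object JucBridge {
  // Blocks a pool thread on jf.get() and exposes the result as a Scala Future.
  // For many concurrent wrapped futures, give this its own ExecutionContext.
  def wrapJavaFuture[T](jf: JFuture[T])(implicit ec: ExecutionContext): Future[T] =
    Future { jf.get() }
}

A truly non-blocking adapter needs the underlying client to expose a callback or completion hook instead of a bare java.util.concurrent.Future.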

Can a scheduled future cause a memory leak?

Submitted by 隐身守侯 on 2019-11-30 11:48:21
I think I have a memory leak in my Android live wallpaper. Whenever I rotate the screen, the amount of memory garbage collected increases by 50 kB and doesn't go back down. I think it may be caused by a scheduled future, so I'm going to present a scenario to see if that's the case. Let's say you have a class (let's call it Foo) that has the following members: private ScheduledFuture<?> future; private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor(); private final Runnable runnable = new Runnable() { public void run() { // Do stuff } }; And now you set a…
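For comparison, a small sketch (in Scala, over the same java.util.concurrent API; the class and method names are illustrative) of the usual fix: cancel the scheduled future and shut the executor down when the component goes away. If the single-thread scheduler is left running, its worker thread keeps the Runnable, and whatever the Runnable captures, reachable, which looks exactly like memory that "doesn't go back down" after a rotation.

import java.util.concurrent.{Executors, ScheduledFuture, TimeUnit}

class Foo {
  private val scheduler = Executors.newSingleThreadScheduledExecutor()
  @volatile private var future: Option[ScheduledFuture[_]] = None

  def start(): Unit = {
    future = Some(scheduler.scheduleAtFixedRate(new Runnable {
      def run(): Unit = { /* do stuff */ }
    }, 0, 1, TimeUnit.SECONDS))
  }

  // Cancel the task AND shut the scheduler down; an idle executor thread is
  // enough to keep this whole object alive across configuration changes.
  def stop(): Unit = {
    future.foreach(_.cancel(true))
    scheduler.shutdownNow()
  }
}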

How do I wait for a Scala future's onSuccess callback to complete?

Submitted by £可爱£侵袭症+ on 2019-11-30 11:36:24
In Scala, I can use Await to wait for a future to complete. However, if I have registered a callback to run upon completion of that future, how can I wait not only for the future to complete but also for that callback to finish? Here is a minimal but complete program to illustrate the problem: import scala.concurrent.ExecutionContext.Implicits.global import scala.concurrent.duration.Duration import scala.concurrent.{ Await, Future } object Main { def main(args: Array[String]): Unit = { val f: Future[Int] = Future(0) f.onSuccess { case _ => Thread.sleep(10000) println("The program waited…
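One answer commonly given, sketched below (keeping the question's onSuccess even though later Scala versions deprecate it in favour of foreach/onComplete; the callbackDone name and message are made up): complete a Promise at the end of the callback and await that Promise's future instead of f.

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, Future, Promise}

object Main {
  def main(args: Array[String]): Unit = {
    val f: Future[Int] = Future(0)
    val callbackDone = Promise[Unit]()

    f.onSuccess { case _ =>
      Thread.sleep(1000)
      println("The program waited for the callback, not just the future.")
      callbackDone.success(())  // signal that the callback has finished
    }

    // Completes only after the callback body above has run to the end.
    Await.ready(callbackDone.future, Duration.Inf)
  }
}

The same effect with less machinery: replace onSuccess with andThen and await the Future that andThen returns.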

ProcessPoolExecutor from concurrent.futures way slower than multiprocessing.Pool

Submitted by 血红的双手。 on 2019-11-30 10:29:39
I was experimenting with the shiny new concurrent.futures module introduced in Python 3.2, and I've noticed that, with almost identical code, using the Pool from concurrent.futures is way slower than using multiprocessing.Pool. This is the version using multiprocessing: def hard_work(n): # Real hard work here pass if __name__ == '__main__': from multiprocessing import Pool, cpu_count try: workers = cpu_count() except NotImplementedError: workers = 1 pool = Pool(processes=workers) result =…