concurrency

Get KeyValuePair by Key from a ConcurrentDictionary (in O(1) time)

偶尔善良 submitted on 2020-04-30 02:13:20

Question: As per this solution (https://stackoverflow.com/a/18923091/529618), I am using a ConcurrentDictionary<T,byte> as a workaround for the lack of a ConcurrentHashSet<T>. However, I'm struggling to see how I can get the original T key back out of the dictionary in O(1) time.

var cache = new ConcurrentDictionary<MyEquatableClass, byte>();
//...
if (!cache.TryAdd(classInstance, Byte.MinValue))
    return /* Existing cache entry */;
return classInstance;

Is there any way to get the KeyValuePair<K,V> (or …
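One way to get the canonical key back in O(1) (a minimal sketch, not taken from the linked answer: it trades the byte value for storing the key as its own value) is to use a ConcurrentDictionary<T,T> as an intern pool, so GetOrAdd hands back whichever instance is already stored:

```csharp
using System.Collections.Concurrent;

// Sketch of an "intern pool": the dictionary maps each key to itself, so an
// equal-but-distinct instance can be swapped for the stored one in O(1).
public class InternPool<T> where T : class
{
    private readonly ConcurrentDictionary<T, T> _cache = new ConcurrentDictionary<T, T>();

    // Returns the cached instance if an equal key already exists,
    // otherwise stores and returns the supplied instance.
    public T Intern(T instance) => _cache.GetOrAdd(instance, instance);
}
```

Because the stored value is the key itself, detecting the duplicate and retrieving the existing instance happen in a single dictionary operation.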

How to write Unit test case for adding callback for ListenableFuture

百般思念 submitted on 2020-04-28 09:48:02

Question: I am trying to write a unit test case for adding a callback to a ListenableFuture, but I am not sure how to do it. I didn't find anything useful on the internet.

@Test
public void can_publish_data_to_kafka() {
    String topic = someString(10);
    String key = someAlphanumericString(5);
    String data = someString(50);
    SendResult sendResult = mock(SendResult.class);
    ListenableFuture<SendResult<String, Object>> future = mock(ListenableFuture.class);
    given(kafkaTemplate.send(topic, key, data)).willReturn(future);
    …
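One way to unit-test the callback (a sketch assuming JUnit 4, Mockito/BDDMockito, spring-kafka, and a hypothetical KafkaPublisher class under test that calls kafkaTemplate.send(...) and registers the callback) is to mock the future, capture the ListenableFutureCallback passed to addCallback, and invoke its onSuccess/onFailure branches directly:

```java
import static org.mockito.BDDMockito.given;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Captor;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.ListenableFutureCallback;

@RunWith(MockitoJUnitRunner.class)
public class KafkaPublisherTest {

    @Mock
    private KafkaTemplate<String, Object> kafkaTemplate;

    @Captor
    private ArgumentCaptor<ListenableFutureCallback<SendResult<String, Object>>> callbackCaptor;

    @Test
    @SuppressWarnings("unchecked")
    public void can_publish_data_to_kafka() {
        ListenableFuture<SendResult<String, Object>> future = mock(ListenableFuture.class);
        given(kafkaTemplate.send("topic", "key", "data")).willReturn(future);

        // Hypothetical class under test: calls send() and registers a callback on the future.
        KafkaPublisher publisher = new KafkaPublisher(kafkaTemplate);
        publisher.publish("topic", "key", "data");

        // Capture the callback that the production code registered...
        verify(future).addCallback(callbackCaptor.capture());

        // ...then drive both branches of it by hand.
        callbackCaptor.getValue().onSuccess(mock(SendResult.class));
        callbackCaptor.getValue().onFailure(new RuntimeException("boom"));
    }
}
```

Capturing the callback avoids having to make the mocked future actually complete; the test simply asserts what the production code does in each branch.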

How would I go about using concurrent.futures and queues for a real-time scenario?

匆匆过客 submitted on 2020-04-27 23:28:56

Question: It is fairly easy to do parallel work with Python 3's concurrent.futures module, as shown below.

with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    future_to = {executor.submit(do_work, input, 60): input for input in dictionary}
    for future in concurrent.futures.as_completed(future_to):
        data = future.result()

It is also very handy to insert items into, and retrieve them from, a Queue.

q = queue.Queue()
for task in tasks:
    q.put(task)
while not q.empty():
    q.get()

I have a script running …
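One way to combine the two for a continuously fed workload (a minimal sketch; do_work, the producer loop, and the None sentinel are stand-ins for the real script) is to let a producer thread feed the Queue while a ThreadPoolExecutor drains it:

```python
import concurrent.futures
import queue
import threading

task_queue = queue.Queue()

def do_work(item):
    # Stand-in for the real computation.
    return item * 2

def producer():
    # Stand-in for the real-time event source that keeps adding work.
    for i in range(20):
        task_queue.put(i)
    task_queue.put(None)  # sentinel: tells the consumer no more work is coming

def consumer():
    futures = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
        while True:
            item = task_queue.get()
            if item is None:
                break
            futures.append(executor.submit(do_work, item))
        for future in concurrent.futures.as_completed(futures):
            print(future.result())

threading.Thread(target=producer).start()
consumer()
```

The queue decouples the rate at which work arrives from the rate at which the pool can process it; the sentinel is just one simple way to signal shutdown.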

How to solve ClickHouse deadlock?

感情迁移 submitted on 2020-04-18 12:35:51

Question: I was running a set of 10 concurrent tests when ClickHouse became deadlocked. The SQL was the following:

select id, sum(a) as a, sum(b) as b, sum(c) as c, round(sum(d), 2) as d
from f_table
where xxx

I ran pstack my-clickhouse-server-process-id and saw some __lll_lock_wait frames. Sorry for posting so many thread stack logs; I thought more information might give you some ideas. Since the reproduction is not stable at present, I haven't posted it as a GitHub issue. I read https://github.com/ClickHouse …
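This is not a fix for the reported deadlock, but as a first diagnostic step it can help to see which queries are still executing and, if the server is still responsive, kill a stuck one (a sketch using ClickHouse's system.processes table; the query_id value is hypothetical):

```sql
-- List queries that are currently running and how long they have been running.
SELECT query_id, elapsed, query
FROM system.processes
ORDER BY elapsed DESC;

-- Ask the server to terminate one specific stuck query (query_id is hypothetical).
KILL QUERY WHERE query_id = 'some-stuck-query-id';
```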

Concurrent file accesses from different scripts python

泪湿孤枕 submitted on 2020-04-17 20:27:06

Question: I have several scripts. Each of them does some computation and is completely independent of the others. Once these computations are done, they are saved to disk and a record is updated. The record is maintained by an instance of a class, which saves itself to disk. I would like to have a single record instance used across multiple scripts (for example, record_manager = RecordManager(file_on_disk) and then record_manager.update(...)); but I can't do this right now, because when updating …
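One way to let several independent scripts share a single on-disk record safely (a sketch assuming a POSIX system; the JSON record format, RECORD_PATH, and update_record are hypothetical stand-ins for the RecordManager class) is to take an exclusive advisory file lock around every read-modify-write:

```python
import fcntl
import json
import os

RECORD_PATH = "record.json"        # hypothetical shared record file
LOCK_PATH = RECORD_PATH + ".lock"  # separate lock file guarding the record

def update_record(key, value):
    # Each script opens the same lock file and blocks until it holds the lock,
    # so only one process at a time can read and rewrite the record.
    with open(LOCK_PATH, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)
        try:
            data = {}
            if os.path.exists(RECORD_PATH):
                with open(RECORD_PATH) as f:
                    data = json.load(f)
            data[key] = value
            with open(RECORD_PATH, "w") as f:
                json.dump(data, f)
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)

update_record("experiment_42", "finished")
```

fcntl is POSIX-only; on Windows a cross-platform locking package would be needed instead, but the read-lock-modify-write structure stays the same.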

Properly setting up a simple server-side cache

你离开我真会死。 submitted on 2020-04-16 05:35:54

Question: I'm trying to set up a server-side cache properly, and I'm looking for constructive criticism of my current setup. The cache is loaded when the servlet starts and is never changed again, so in effect it is a read-only cache. It obviously needs to stay in memory for the lifetime of the servlet. Here's how I have it set up:

private static List<ProductData> _cache;
private static ProductManager productManager;

private ProductManager() {
    try {
        lookup();
    } catch (Exception ex) {
        _cache = null;
        …
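For a cache that is loaded once and only ever read, one alternative sketch (the initialization-on-demand holder idiom; lookup() and ProductData below are placeholders for the question's own loader and type) gives thread-safe lazy loading without explicit synchronization and publishes the list as unmodifiable:

```java
import java.util.Collections;
import java.util.List;

public final class ProductCache {

    private ProductCache() { }

    private static final class Holder {
        // The JVM guarantees this runs exactly once, on first access,
        // with class-initialization thread safety.
        static final List<ProductData> CACHE = Collections.unmodifiableList(lookup());
    }

    public static List<ProductData> get() {
        return Holder.CACHE;
    }

    // Hypothetical loader standing in for the servlet's real data source.
    private static List<ProductData> lookup() {
        return List.of(new ProductData());
    }

    // Placeholder for the question's ProductData type.
    static class ProductData { }
}
```

Because the list is built once, wrapped as unmodifiable, and reached only through a final field, readers never need locking and can never mutate the cache by accident.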

Is it possible to terminate a promise's code block from another promise?

♀尐吖头ヾ submitted on 2020-04-12 09:31:11

Question: I wrote this test program:

await Promise.anyof(
    Promise.allof( (^5).map: { start { sleep 10; say "done $_" } } ),
    Promise.in(5).then: { say 'ouch' }
);
sleep 10;

When the second promise times out, it prints 'ouch' and the await exits, but the first promise's code block is still running. After five more seconds its five processes end and print 'done':

$ ./test1.p6
ouch
done 0
done 1
done 2
done 3
done 4

I tried to terminate the first promise by assigning it to a variable and then calling the .break …
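A start block cannot be terminated from the outside once it is running; a common workaround (a sketch, not a drop-in replacement for the test program above) is cooperative cancellation, where each worker polls a shared "stop" Promise and exits on its own:

```raku
my $stop = Promise.new;

my @workers = (^5).map: -> $n {
    start {
        for ^10 {
            last if $stop.status ~~ Kept;   # poll the cancellation signal between steps
            sleep 1;
        }
        say "worker $n done or cancelled";
    }
};

Promise.in(5).then({ $stop.keep });   # after about 5 seconds, ask the workers to stop
await @workers;
```

The workers only notice the signal at the points where they check it, so long blocking calls such as a single sleep 10 still run to completion; the work has to be broken into cancellable steps.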