akka-stream

How to get started with Akka Streams? [closed]

Submitted by 寵の児 on 2019-12-20 07:56:39
Question (closed 3 years ago because it needed to be more focused): The Akka Streams library already comes with quite a wealth of documentation. However, the main problem for me is that it provides too much material; I feel quite overwhelmed by the number of concepts that I have to learn. Lots of the examples shown there feel very heavyweight and can…
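For readers who just want a first working stream, here is a minimal sketch (assuming Akka 2.5.x, where an ActorMaterializer is still required): a Source emits values, a map stage transforms them, and a Sink consumes them.

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}

object HelloStreams extends App {
  implicit val system: ActorSystem = ActorSystem("hello-streams")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  // Source -> transformation -> Sink: the three core building blocks.
  Source(1 to 5)
    .map(_ * 2)
    .runWith(Sink.foreach(println))
    .onComplete(_ => system.terminate())(system.dispatcher)
}
```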

How to make a POST call to self-certified server with akka-http

Submitted by 落花浮王杯 on 2019-12-20 05:24:09
Question: I have an akka-streams topology in which I make a POST call using akka-http. I get the following error when sending the POST request to a server with a self-signed certificate. It is an internal server, so I am fine from a security point of view. javax.net.ssl.SSLHandshakeException: General SSLEngine problem at sun.security.ssl.Handshaker.checkThrown(Handshaker.java:1478) ~[?:1.8.0_131] at sun.security.ssl.SSLEngineImpl.checkTaskThrown(SSLEngineImpl.java:535) ~[?:1.8.0_131] at sun.security…
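One common workaround is to build an SSLContext that trusts every certificate and use it only for calls to the internal server. The following is a sketch only, assuming akka-http 10.1.x (where ConnectionContext.https and the connectionContext parameter of singleRequest are available); the host and payload are placeholders.

```scala
import java.security.cert.X509Certificate
import javax.net.ssl.{SSLContext, TrustManager, X509TrustManager}

import akka.actor.ActorSystem
import akka.http.scaladsl.{ConnectionContext, Http, HttpsConnectionContext}
import akka.http.scaladsl.model._
import akka.stream.ActorMaterializer

object InsecurePost extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // WARNING: trusts every certificate; only acceptable for internal/test servers.
  val trustAll: Array[TrustManager] = Array(new X509TrustManager {
    override def checkClientTrusted(chain: Array[X509Certificate], authType: String): Unit = ()
    override def checkServerTrusted(chain: Array[X509Certificate], authType: String): Unit = ()
    override def getAcceptedIssuers: Array[X509Certificate] = Array.empty
  })

  val sslContext = SSLContext.getInstance("TLS")
  sslContext.init(null, trustAll, new java.security.SecureRandom())

  val insecureCtx: HttpsConnectionContext = ConnectionContext.https(sslContext)

  // Pass the relaxed context only for this request, leaving the default client context untouched.
  Http().singleRequest(
    HttpRequest(HttpMethods.POST, uri = "https://internal.example.com/api", entity = HttpEntity("payload")),
    connectionContext = insecureCtx
  )
}
```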

Idiomatic way to use Spark DStream as Source for an Akka stream

Submitted by ﹥>﹥吖頭↗ on 2019-12-19 07:56:29
Question: I'm building a REST API that starts a calculation in a Spark cluster and responds with a chunked stream of the results. Given the Spark stream with calculation results, I can use dstream.foreachRDD() to send the data out of Spark. I'm sending the chunked HTTP response with akka-http: val requestHandler: HttpRequest => HttpResponse = { case HttpRequest(HttpMethods.GET, Uri.Path("/data"), _, _, _) => HttpResponse(entity = HttpEntity.Chunked(ContentTypes.`text/plain`, source)) } For…
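One way to bridge the two worlds is to materialize a bounded queue up front and offer each micro-batch's records to it from foreachRDD. This is a sketch under assumptions: Akka 2.5.12+ for Source#preMaterialize, batches small enough to collect() on the driver, and a hypothetical dstreamToSource helper.

```scala
import akka.actor.ActorSystem
import akka.stream.{ActorMaterializer, OverflowStrategy}
import akka.stream.scaladsl.Source
import akka.util.ByteString
import org.apache.spark.streaming.dstream.DStream

object DStreamBridge {
  // Turns a DStream into an Akka Streams Source by pushing each collected
  // micro-batch into a bounded queue that the chunked HTTP response can consume.
  def dstreamToSource(dstream: DStream[String])
                     (implicit system: ActorSystem): Source[ByteString, _] = {
    implicit val mat: ActorMaterializer = ActorMaterializer()

    val (queue, source) =
      Source.queue[ByteString](1000, OverflowStrategy.dropHead)
        .preMaterialize()

    // offer() returns a Future that completes when the element is accepted;
    // it is ignored here for brevity, so a slow consumer causes old chunks to
    // be dropped rather than backpressuring the Spark driver.
    dstream.foreachRDD { rdd =>
      rdd.collect().foreach(line => queue.offer(ByteString(line + "\n")))
    }

    source
  }
}
```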

How to limit an Akka Stream to execute and send down only one message per second?

Submitted by 女生的网名这么多〃 on 2019-12-18 05:56:14
Question: I have an Akka Stream and I want the stream to send messages downstream approximately once every second. I tried two ways to solve this problem. The first way was to make the producer at the start of the stream send a message only once per second, whenever a Continue message comes into that actor: // When receiving a Continue message in an ActorPublisher // do work then... if (totalDemand > 0) { import scala.concurrent.duration._ context.system.scheduler.scheduleOnce(1 second, self, Continue) } This…
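Rather than scheduling Continue messages by hand, the built-in throttle stage can shape the rate. A minimal sketch (Akka Streams 2.5+):

```scala
import akka.actor.ActorSystem
import akka.stream.{ActorMaterializer, ThrottleMode}
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.duration._

object ThrottleExample extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // Emits at most one element per second; upstream is backpressured in between.
  Source(1 to 10)
    .throttle(1, 1.second, 1, ThrottleMode.Shaping)
    .runWith(Sink.foreach(n => println(s"$n at ${System.currentTimeMillis()}")))
}
```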

How are reactive streams used in Slick for inserting data

Submitted by 前提是你 on 2019-12-18 04:10:44
Question: In Slick's documentation, examples of using Reactive Streams are presented only for reading data by means of a DatabasePublisher. But what happens when you want to use your database as a Sink and backpressure based on your insertion rate? I've looked for an equivalent DatabaseSubscriber, but it doesn't exist. So the question is: if I have a Source, say val source = Source(0 to 100), how can I create a Sink with Slick that writes those values into a table with schema: create table NumberTable…
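A sketch of one workable approach (the table mapping and the config key numbersDb are assumptions): run each insert through mapAsync, so the number of in-flight db.run calls is bounded and the source is backpressured by the rate at which the database completes inserts.

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import slick.jdbc.H2Profile.api._

object SlickSinkSketch extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()
  import system.dispatcher

  // Hypothetical mapping for "create table NumberTable(value int)".
  class Numbers(tag: Tag) extends Table[Int](tag, "NumberTable") {
    def value = column[Int]("value")
    def * = value
  }
  val numbers = TableQuery[Numbers]
  val db = Database.forConfig("numbersDb") // assumed config key

  // mapAsync limits the number of in-flight inserts, which is what
  // propagates backpressure from the database back to the Source.
  Source(0 to 100)
    .mapAsync(parallelism = 4)(n => db.run(numbers += n))
    .runWith(Sink.ignore)
    .onComplete { _ => db.close(); system.terminate() }
}
```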

Conditionally skip flow using akka streams

Submitted by 纵饮孤独 on 2019-12-18 02:39:06
Question: I'm using akka streams and I have a segment of my graph that I need to conditionally skip because the flow can't handle certain values. Specifically, I have a flow that takes a string and makes HTTP requests, but the server can't handle the case when the string is empty; I need to just return an empty string instead. Is there a way of doing this without having to go through the HTTP request knowing it will fail? I basically have this: val source = Source("1", "2", "", "3", "4") val…
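A simple way to express the skip without a graph junction (a sketch; callServer stands in for the real akka-http request): pattern-match inside a single mapAsync stage and short-circuit the empty string.

```scala
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Flow, Sink, Source}
import scala.concurrent.Future

object ConditionalSkip extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // Stand-in for the real HTTP call (hypothetical).
  def callServer(s: String): Future[String] = Future.successful(s"response($s)")

  // Empty strings bypass the remote call entirely; everything else goes through it.
  val guardedFlow: Flow[String, String, NotUsed] =
    Flow[String].mapAsync(parallelism = 2) {
      case ""    => Future.successful("")
      case other => callServer(other)
    }

  Source(List("1", "2", "", "3", "4"))
    .via(guardedFlow)
    .runWith(Sink.foreach(println))
}
```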

Why doesn't the Akka Streams cycle end in this graph?

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-17 21:12:54
Question: I would like to create a graph that loops n times before going to the sink. I've just created this sample, which fulfills my requirements but doesn't end after reaching the sink, and I really don't understand why. Can someone enlighten me? Thanks. import akka.actor.ActorSystem import akka.stream.scaladsl._ import akka.stream.{ActorMaterializer, UniformFanOutShape} import scala.concurrent.Future object test { def main(args: Array[String]) { val ignore: Sink[Any, Future[Unit]] = Sink.ignore val closed:…
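A common reason such graphs never terminate is that completion does not propagate through a feedback cycle: the merge stage keeps waiting on its feedback input, which never completes. If the goal is simply to apply a transformation n times per element, a cycle-free sketch like the following (with a hypothetical step function) avoids the problem entirely.

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}

object BoundedLoop extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val n = 3
  def step(x: Int): Int = x * 2 // hypothetical per-iteration transformation

  // Applying `step` n times inside a single map stage keeps the graph acyclic,
  // so completion of the Source propagates straight through to the Sink.
  Source(1 to 5)
    .map(x => Iterator.iterate(x)(step).drop(n).next())
    .runWith(Sink.foreach(println))
    .onComplete(_ => system.terminate())(system.dispatcher)
}
```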

Stream records from a database using Akka Streams

Submitted by ≯℡__Kan透↙ on 2019-12-17 19:36:50
Question: I have a system using Akka which currently handles incoming streaming data over message queues. When a record arrives, it is processed, the MQ is acked, and the record is passed on for further handling within the system. Now I would like to add support for using databases as input. What would be a good way for the input source to handle a DB? It should stream in more than 100M records at the pace the receiver can handle, so I presume reactive/akka-streams. Answer 1: Slick Library. Slick streaming is…
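Slick's db.stream returns a DatabasePublisher, which is a Reactive Streams Publisher and therefore plugs straight into Akka Streams via Source.fromPublisher; downstream demand then controls the read rate. A sketch follows (the table mapping and config key are assumptions; some databases also need fetch-size hints for very large result sets).

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import slick.jdbc.PostgresProfile.api._

object DbToStream extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // Hypothetical table of records keyed by id.
  class Records(tag: Tag) extends Table[(Long, String)](tag, "records") {
    def id      = column[Long]("id")
    def payload = column[String]("payload")
    def *       = (id, payload)
  }
  val records = TableQuery[Records]
  val db = Database.forConfig("recordsDb") // assumed config key

  // db.stream produces a Reactive Streams publisher; Akka only pulls
  // more rows when the downstream consumer has demand.
  Source.fromPublisher(db.stream(records.result))
    .runWith(Sink.foreach { case (id, payload) => println(s"$id -> $payload") })
}
```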

Difference between map and mapAsync

Submitted by 一笑奈何 on 2019-12-17 18:33:06
Question: Can anyone please explain the difference between map and mapAsync with respect to Akka Streams? The documentation says that "Stream transformations and side effects involving external non-stream based services can be performed with mapAsync or mapAsyncUnordered." Why can't we simply use map here? I assume that Flow, Source, and Sink are all monadic in nature, and thus map should work fine with respect to the delay inherent in these? Answer 1: Signature. The difference is best highlighted in the signatures:…
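A minimal sketch of the contrast (the lookup service call is hypothetical): map takes a plain function Out => T, so a slow or blocking call stalls the stage, whereas mapAsync(parallelism) takes Out => Future[T] and keeps several futures in flight while preserving element order.

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.Future

object MapVsMapAsync extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()
  import system.dispatcher

  // Hypothetical call to an external service that returns a Future.
  def lookup(id: Int): Future[String] = Future(s"user-$id")

  // map takes Out => T: any blocking inside the function stalls this stage.
  Source(1 to 3).map(id => s"sync-$id").runWith(Sink.foreach(println))

  // mapAsync takes Out => Future[T] and keeps up to `parallelism` futures
  // in flight, emitting results downstream in the original order.
  Source(1 to 3).mapAsync(parallelism = 2)(lookup).runWith(Sink.foreach(println))
}
```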

Akka Streams: What does Mat represent in Source[Out, Mat]?

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-17 16:27:40
Question: In Akka Streams, what does Mat in Source[Out, Mat] or Sink[In, Mat] represent? When will it actually be used? Answer 1: The Mat type parameter represents the type of the materialized value of this stream. Remember that in Akka, Source, Flow, and Sink (well, all graphs) are just blueprints: they do not do any processing by themselves, they only describe how the stream should be constructed. The process of turning these blueprints into a working stream with live data is called materialization. The…
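A small sketch of a materialized value in practice: Sink.fold materializes a Future[Int], and Keep.right selects that value as the result of running the graph.

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Keep, Sink, Source}
import scala.concurrent.Future

object MaterializedValues extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val mat: ActorMaterializer = ActorMaterializer()
  import system.dispatcher

  // Sink.fold materializes a Future[Int]; that Future is the Mat of this sink.
  val sumSink: Sink[Int, Future[Int]] = Sink.fold[Int, Int](0)(_ + _)

  // Keep.right says "keep the sink's materialized value" when the graph runs.
  val sum: Future[Int] =
    Source(1 to 10).toMat(sumSink)(Keep.right).run()

  sum.foreach(total => println(s"sum = $total"))
}
```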