akka-stream

Why does this akka-http route test never complete successfully?

血红的双手。 Submitted on 2019-12-11 14:46:27
Question: I have a simple route and some tests that succeed individually, but collectively fail with a timeout. Any idea why? val route = (requestHandler: ActorRef @@ Web) => { get { pathPrefix("apps") { pathEndOrSingleSlash { completeWith(implicitly[ToEntityMarshaller[List[String]]]) { callback => requestHandler ! GetAppsRequest(callback) } } ~ path("stats") { completeWith(implicitly[ToEntityMarshaller[List[Stats]]]) { callback => requestHandler ! GetStatsRequest(callback) } } } ~ path("apps" / Segment /
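A minimal sketch of how such a route can be exercised with akka-http-testkit, assuming a stub handler that always answers the callback and a raised RouteTestTimeout; all names below are illustrative, not taken from the truncated post:

```scala
import akka.actor.{Actor, ActorSystem, Props}
import akka.http.scaladsl.marshalling.ToResponseMarshaller
import akka.http.scaladsl.model.StatusCodes
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.testkit.{RouteTestTimeout, ScalatestRouteTest}
import org.scalatest.matchers.should.Matchers
import org.scalatest.wordspec.AnyWordSpec
import scala.concurrent.duration._

class AppsRouteSpec extends AnyWordSpec with Matchers with ScalatestRouteTest {
  // Route tests time out after 1 second by default; raise it while debugging.
  implicit val routeTimeout: RouteTestTimeout = RouteTestTimeout(5.seconds)

  // A stub handler that always answers the callback, so completeWith can finish.
  case class GetAppsRequest(callback: String => Unit)
  class StubHandler extends Actor {
    def receive = { case GetAppsRequest(cb) => cb("app-1, app-2") }
  }
  val handler = system.actorOf(Props(new StubHandler))

  val route = get {
    path("apps") {
      completeWith(implicitly[ToResponseMarshaller[String]]) { cb =>
        handler ! GetAppsRequest(cb)
      }
    }
  }

  "GET /apps" should {
    "complete once the handler invokes the callback" in {
      Get("/apps") ~> route ~> check {
        status shouldBe StatusCodes.OK
      }
    }
  }
}
```

If tests only fail when run together, it is worth checking that the handler used in each test really answers every callback, since an unanswered completeWith callback leaves the request hanging until the testkit timeout fires.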

Akka stream map with retry

北城以北 Submitted on 2019-12-11 11:27:11
Question: How can I retry if mapping/processing of an element in the stream fails? I have tried setting the decider in the materializer, but it does not offer retry; it simply maps the exception to a Supervision directive. Thanks. Answer 1: Consider this a tip rather than a complete answer. I recently implemented similar functionality with futures and mapAsync. There is a library called retry for retrying futures with different strategies such as pause or backoff. But this method has a problem: because akka streams use
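A hedged sketch of the approach the answer describes, retrying the per-element Future inside mapAsync; it uses akka.pattern.retry (available in recent Akka versions) instead of the third-party retry library, and assumes an Akka 2.6-style ActorSystem that provides the materializer. The process function is a placeholder:

```scala
import akka.actor.{ActorSystem, Scheduler}
import akka.pattern.retry
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.Future
import scala.concurrent.duration._

object RetryPerElement extends App {
  implicit val system: ActorSystem = ActorSystem("retry-example")
  import system.dispatcher
  implicit val scheduler: Scheduler = system.scheduler

  // Placeholder for the real per-element processing, failing ~30% of the time.
  def process(i: Int): Future[Int] =
    if (scala.util.Random.nextDouble() < 0.3) Future.failed(new RuntimeException("flaky"))
    else Future.successful(i * 2)

  Source(1 to 20)
    .mapAsync(parallelism = 4) { i =>
      // Retry each element up to 3 times with a 100 ms pause between attempts.
      retry(() => process(i), 3, 100.millis)
    }
    .runWith(Sink.foreach(println))
    .onComplete(_ => system.terminate())
}
```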

Selective request-throttling using akka-http stream

痞子三分冷 Submitted on 2019-12-11 06:45:28
Question: I have one API which calls two other downstream APIs. One downstream API ( https://test/foo ) is really important and very fast. The other, slower downstream API ( https://test/bar ) has a limitation: it can only handle a throughput of 50 requests per second. I would like to make sure the downstream API https://test/foo has higher priority than https://test/bar . For example, if the API thread pool is 75, I only allow 50 parallel incoming connections to go through https://test/bar . The rest of
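A minimal sketch of throttling only the slow downstream with the throttle operator, assuming Akka HTTP's singleRequest client; the 50 requests/second figure mirrors the question, everything else is illustrative:

```scala
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.{HttpRequest, HttpResponse}
import akka.stream.ThrottleMode
import akka.stream.scaladsl.{Flow, Sink, Source}
import scala.concurrent.duration._

object SelectiveThrottle extends App {
  implicit val system: ActorSystem = ActorSystem("throttle-example")

  // Only the slow, rate-limited downstream goes through a throttled flow;
  // calls to the fast downstream would bypass it entirely.
  val barFlow: Flow[HttpRequest, HttpResponse, _] =
    Flow[HttpRequest]
      .throttle(50, 1.second, 50, ThrottleMode.Shaping) // at most 50 requests per second
      .mapAsync(parallelism = 50)(req => Http().singleRequest(req))

  Source.single(HttpRequest(uri = "https://test/bar"))
    .via(barFlow)
    .runWith(Sink.foreach(resp => println(resp.status)))
}
```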

Consuming both Strict and Streamed WebSocket Messages in Akka

前提是你 Submitted on 2019-12-11 05:00:47
Question: I am experimenting with building a WebSocket service using Akka HTTP. I need to handle Strict messages that arrive in their entirety, as well as Streamed messages that arrive in multiple frames. I am using a route with handleWebSocketMessages() to pass the handling of web sockets off to a flow. The code I have looks something like this: val route: Route = get { handleWebSocketMessages(createFlow()) } def createFlow(): Flow[Message, Message, Any] = Flow[Message] .collect { case TextMessage
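A hedged sketch of one way to treat both message kinds uniformly: turn each incoming text message into a Source[String], folding streamed frames together, then flatten. Note the fold is unbounded, so a very large streamed message would still be held in memory:

```scala
import akka.http.scaladsl.model.ws.{Message, TextMessage}
import akka.stream.scaladsl.{Flow, Source}

object WsFlows {
  // Normalize both strict and streamed text messages to a single String, then reply.
  def createFlow(): Flow[Message, Message, Any] =
    Flow[Message]
      .collect {
        case TextMessage.Strict(text)     => Source.single(text)      // whole message in one frame
        case TextMessage.Streamed(frames) => frames.fold("")(_ + _)   // concatenate the frames
      }
      .flatMapConcat(identity)                                        // Source[String] -> String elements
      .map(text => TextMessage(s"echo: $text"))
}
```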

Akka Stream Graph recover issue

て烟熏妆下的殇ゞ Submitted on 2019-12-11 03:09:03
Question: I have created a graph to parallelise two flows with the same input. The flows produce Future[Option[Entity]]. If flowA fails I would like to return a Future[None], but the recover does not seem to be called. val graph: Flow[Input, (Future[Option[Entity]], Future[Option[Entity]]), NotUsed] = Flow.fromGraph(GraphDSL.create() { implicit builder => import GraphDSL.Implicits._ val broadcast = builder.add(Broadcast[Input](2)) val zip = builder.add(Zip[Future[Option[Entity]], Future[Option[Entity]]])
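A hedged sketch of why the stream-level recover may not fire here: a failed Future travelling through the graph is just an ordinary element, so the failure never reaches the stream itself. Recovering on the Future value (type names below are placeholders for the poster's types) is one alternative:

```scala
import akka.NotUsed
import akka.stream.scaladsl.Flow
import scala.concurrent.{ExecutionContext, Future}

object RecoverSketch {
  type Input = String                      // placeholders for the poster's types
  final case class Entity(id: String)

  // The stream never sees the failed Future as a stream failure, so recover on
  // the Future itself; a failed lookup then becomes Future.successful(None).
  def withFallback(flowA: Flow[Input, Future[Option[Entity]], NotUsed])
                  (implicit ec: ExecutionContext): Flow[Input, Future[Option[Entity]], NotUsed] =
    flowA.map(_.recover { case _ => None })
}
```

Alternatively, using mapAsync to await the Future inside the flow makes the failure a real stream failure, at which point the stream-level recover would apply.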

How does one close an Akka stream?

旧时模样 Submitted on 2019-12-11 01:29:00
Question: I am updating some experimental spray code to akka-stream (2.0.2), so I'm going through the documentation little by little. One of my needs is that if I detect a protocol violation in my stream, I need to close the stream immediately and kick the client off. What is the proper way of immediately closing (terminating) the stream from inside a Flow? Answer 1: Use a PushStage : import akka.stream.stage._ val closeStage = new PushStage[Tpe, Tpe] { override def onPush(elem: Tpe, ctx: Context[Tpe
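PushStage was deprecated in later Akka versions in favour of GraphStage; a hedged sketch of two current alternatives on recent Akka releases, either failing the stream from inside an operator or aborting it externally with a KillSwitch:

```scala
import akka.actor.ActorSystem
import akka.stream.KillSwitches
import akka.stream.scaladsl.{Keep, Sink, Source}

object CloseStreamSketch extends App {
  implicit val system: ActorSystem = ActorSystem("close-example")

  final case class ProtocolViolation(msg: String) extends RuntimeException(msg)

  val (killSwitch, done) =
    Source(1 to 1000)
      .map { elem =>
        // Throwing inside an operator fails (and thus terminates) the stream.
        if (elem < 0) throw ProtocolViolation(s"bad element: $elem")
        elem
      }
      .viaMat(KillSwitches.single)(Keep.right)
      .toMat(Sink.foreach(println))(Keep.both)
      .run()

  // ...or abort from outside the stream at any point:
  killSwitch.abort(ProtocolViolation("client violated the protocol"))
}
```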

How to stream zipped file (on the fly) via Play Framework 2.5 in scala?

流过昼夜 Submitted on 2019-12-10 18:09:12
Question: I want to stream some files and zip them on the fly, so users can download multiple files as a single zipped file without anything being written to the local disk. However, my current implementation holds everything in memory and will not work for large files. Is there any way to fix it? I was looking at this implementation: https://gist.github.com/kirked/03c7f111de0e9a1f74377bf95d3f0f60, but couldn't figure out how to use it. import java.io.{BufferedOutputStream, ByteArrayInputStream,
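A hedged sketch of one common way to zip on the fly without buffering whole files: write entries into a ZipOutputStream backed by StreamConverters.asOutputStream, on a separate thread because its writes block until downstream consumes the bytes. File contents and the Play serving line are illustrative, not the gist's code:

```scala
import java.util.zip.{ZipEntry, ZipOutputStream}
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Source, StreamConverters}
import akka.util.ByteString
import scala.concurrent.Future

object ZipOnTheFly {
  // Builds a Source[ByteString, _] whose bytes are produced by writing entries
  // into a ZipOutputStream; only the stream's own buffers are held in memory.
  def zipSource(files: Seq[(String, Array[Byte])])(implicit system: ActorSystem): Source[ByteString, _] =
    StreamConverters.asOutputStream().mapMaterializedValue { os =>
      import system.dispatcher
      Future {                                  // write on another thread: the
        val zip = new ZipOutputStream(os)       // OutputStream blocks until the
        files.foreach { case (name, bytes) =>   // downstream consumes the bytes
          zip.putNextEntry(new ZipEntry(name))
          zip.write(bytes)
          zip.closeEntry()
        }
        zip.close()
      }
    }

  // In a Play controller this could then be served chunked, e.g.:
  //   Ok.chunked(zipSource(files)).as("application/zip")
}
```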

akka-streams with akka-cluster

让人想犯罪 __ Submitted on 2019-12-10 17:33:18
Question: My akka-streams learn-o-thon continues. I'd like to integrate my akka-streams application with akka-cluster and DistributedPubSubMediator. Adding support for Publish is fairly straightforward, but I'm having trouble with the Subscribe part. For reference, a subscriber is given as follows in the Typesafe sample: class ChatClient(name: String) extends Actor { val mediator = DistributedPubSub(context.system).mediator mediator ! Subscribe("some topic", self) def receive = { case ChatClient
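A hedged sketch of one way to feed subscribed messages into a stream: materialize a Source.actorRef and register the materialized ActorRef with the mediator. The topic name is copied from the sample; everything else is illustrative:

```scala
import akka.actor.ActorSystem
import akka.cluster.pubsub.DistributedPubSub
import akka.cluster.pubsub.DistributedPubSubMediator.Subscribe
import akka.stream.OverflowStrategy
import akka.stream.scaladsl.{Keep, Sink, Source}

object PubSubIntoStream extends App {
  implicit val system: ActorSystem = ActorSystem("ClusterSystem")
  val mediator = DistributedPubSub(system).mediator

  // Materialize an actor-backed Source; messages sent to subscriberRef are
  // buffered (dropping the oldest on overflow) and emitted into the stream.
  val (subscriberRef, done) =
    Source.actorRef[String](128, OverflowStrategy.dropHead)
      .toMat(Sink.foreach(msg => println(s"received: $msg")))(Keep.both)
      .run()

  // Every message published to "some topic" now flows into the stream.
  mediator ! Subscribe("some topic", subscriberRef)
}
```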

Is groupBy leaking in akka-stream?

☆樱花仙子☆ Submitted on 2019-12-10 15:55:54
Question: I want to write an akka-stream flow that groups events from an infinite stream by session_uid and calculates the sum of traffic for each session (details in my previous question). I am going to use the Source#groupBy function to group events by session_uid, but it seems this function accumulates all group keys internally and has no way to release them. This causes a java.lang.OutOfMemoryError: Java heap space exception. Here is code to reproduce it: import akka.actor.ActorSystem import akka
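A hedged sketch of bounding such a pipeline by closing each substream with takeWithin and folding the traffic per session before merging back. Field names are illustrative, and older Akka versions may still retain group keys after substreams complete, which is exactly the leak the question describes:

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.duration._

object SessionTraffic extends App {
  implicit val system: ActorSystem = ActorSystem("groupby-example")

  final case class Event(sessionUid: String, traffic: Long)

  // Stand-in for the infinite event stream.
  val events: Source[Event, _] =
    Source.repeat(()).map(_ => Event(s"session-${scala.util.Random.nextInt(1000)}", 10L))

  events
    .groupBy(maxSubstreams = 1024, _.sessionUid)
    .takeWithin(30.seconds)                               // close each substream eventually
    .fold(("", 0L)) { case ((_, sum), e) => (e.sessionUid, sum + e.traffic) }
    .mergeSubstreams
    .runWith(Sink.foreach { case (uid, total) => println(s"$uid -> $total") })
}
```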

Converting a callback-method implementation into an akka stream Source

China☆狼群 Submitted on 2019-12-10 13:52:38
Question: I am working with a data publisher from a Java library that I do not control. The publisher library uses a typical callback setup; somewhere in the library code (the library is Java, but I will describe it in Scala for terseness): type DataType = ??? trait DataConsumer { def onData(data : DataType) : Unit } The user of the library is required to write a class that implements the onData method and pass it into a DataProducer ; the library code looks something like: class DataProducer(consumer :
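A hedged sketch of bridging the callback API into a stream with Source.queue: the DataConsumer implementation offers each element to the materialized queue. DataType is a stand-in here, and the offer result is ignored for brevity:

```scala
import akka.actor.ActorSystem
import akka.stream.OverflowStrategy
import akka.stream.scaladsl.{Keep, Sink, Source}

object CallbackToSource extends App {
  implicit val system: ActorSystem = ActorSystem("callback-bridge")

  type DataType = String                               // stand-in for the library's type
  trait DataConsumer { def onData(data: DataType): Unit }

  // Materialize a queue-backed Source; elements offered to the queue flow
  // through the stream, with the oldest dropped if the buffer fills up.
  val (queue, done) =
    Source.queue[DataType](1024, OverflowStrategy.dropHead)
      .toMat(Sink.foreach(println))(Keep.both)
      .run()

  val consumer = new DataConsumer {
    // offer returns a Future[QueueOfferResult]; discarded here for brevity.
    def onData(data: DataType): Unit = { queue.offer(data); () }
  }
  // new DataProducer(consumer).start()  // hand the consumer to the real library
}
```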