akka-stream

akka-stream + akka-http lifecycle

Submitted by 浪尽此生 on 2019-12-05 00:42:27
TL;DR: when an outgoing HTTP request is part of the stream, is it better to materialize a stream per request (i.e. use short-lived streams) or to share a single stream materialization across requests?

Details: I have a typical service that takes an HTTP request, scatters it to several third-party downstream services (not controlled by me) and aggregates the results before sending them back. I'm using akka-http for the client implementation and spray for the server (legacy, will move to akka-http over time). Schematically:

    request -> map -1-*-> map -> 3rd party http -> map -*-1-> aggregation ->
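A sketch of one common arrangement (not from the thread itself): keep the host connection pool long-lived and shared, and materialize a short, per-request stream through it. The host name and helper names below are hypothetical.

```scala
import scala.concurrent.Future
import scala.util.Try

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.{HttpRequest, HttpResponse}
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}

object PerRequestStreams {
  implicit val system = ActorSystem("per-request")
  implicit val materializer = ActorMaterializer()
  import system.dispatcher

  // Long-lived: one cached connection pool per downstream host.
  // The pool survives across materializations and does the connection reuse.
  val pool = Http().cachedHostConnectionPool[Int]("downstream.example.com")

  // Short-lived: one materialization per incoming request.
  // Materializing a small graph is cheap; the expensive part (the pool) is shared.
  def scatterGather(requests: List[HttpRequest]): Future[List[Try[HttpResponse]]] =
    Source(requests.zipWithIndex)
      .via(pool)
      .map { case (result, _) => result }
      .runWith(Sink.seq)
      .map(_.toList)
}
```

The per-request materialization keeps failure isolation simple (a failed stream affects one request), while the shared pool still bounds and reuses connections to the third party.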

Get whole HttpResponse body as a String with Akka-Streams HTTP

Submitted by 不想你离开。 on 2019-12-04 21:20:12
Question: I'm trying to understand how to use the new akka.http library. I would like to send an HTTP request to a server and read the whole response body as a single String, in order to produce a Source[String, ?]. Here is the best solution I was able to produce so far:

    def get(
        modelID: String,
        pool: Flow[(HttpRequest, Int), (Try[HttpResponse], Int), Http.HostConnectionPool]
    ): Source[String, Unit] = {
      val uri = reactionsURL(modelID)
      val req = HttpRequest(uri = uri)
      Source.single((req, 0))
        .via(pool)
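One way to collapse a streamed response entity into a single String is toStrict plus Unmarshal. A sketch (the URI is a placeholder, and toStrict buffers the whole entity in memory, so it only suits bounded responses):

```scala
import scala.concurrent.Future
import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.HttpRequest
import akka.http.scaladsl.unmarshalling.Unmarshal
import akka.stream.ActorMaterializer

object BodyAsString {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()
  import system.dispatcher

  def bodyOf(uri: String): Future[String] =
    Http().singleRequest(HttpRequest(uri = uri)).flatMap { response =>
      // toStrict drains the entity stream into memory (bounded by the timeout);
      // Unmarshal then decodes the bytes to a String using the entity's charset.
      response.entity.toStrict(5.seconds).flatMap(strict => Unmarshal(strict).to[String])
    }
}
```

Wrapping the future with Source.fromFuture(bodyOf(uri)) then yields the single-element Source[String, _] the question asks for.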

Backpressure strategies for Akka Stream Source.queue not working

Submitted by 烈酒焚心 on 2019-12-04 17:01:43
I'm trying to understand why the code snippet below does what it does. I would have thought that, because the Sink cannot signal demand as fast as the Source produces content, I would get dropped messages in response to some of the offers (the overflow strategy is set to dropBuffer), and also an error and a "queue closed" message after the self-destruct piece. The snippet:

    package playground

    import java.time.LocalDateTime
    import java.util.concurrent.atomic.AtomicInteger

    import akka.actor.{Actor, ActorLogging, ActorSystem, Props}
    import akka.stream.QueueOfferResult.{Dropped,
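For reference, a minimal Source.queue setup (a sketch under assumed names). A point that often surprises: with dropBuffer the queue discards *previously buffered* elements and still enqueues the new one, so the offer result is Enqueued; QueueOfferResult.Dropped is, as far as I know, only reported for the dropNew strategy.

```scala
import scala.concurrent.Future
import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.stream.{ActorMaterializer, OverflowStrategy, QueueOfferResult, ThrottleMode}
import akka.stream.scaladsl.{Sink, Source}

object QueueDemo extends App {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()
  import system.dispatcher

  // Slow consumer: one element per 100 ms, so offers outpace demand.
  val queue = Source
    .queue[Int](bufferSize = 10, overflowStrategy = OverflowStrategy.dropBuffer)
    .throttle(1, 100.millis, 1, ThrottleMode.Shaping)
    .to(Sink.foreach(println))
    .run()

  // Offer sequentially: wait for each offer's future before issuing the next,
  // since concurrent pending offers are not supported by Source.queue.
  (1 to 100).foldLeft(Future.successful(())) { (acc, i) =>
    acc.flatMap(_ => queue.offer(i)).map {
      case QueueOfferResult.Enqueued => () // dropBuffer still enqueues the new element
      case QueueOfferResult.Dropped  => println(s"dropped $i") // seen with dropNew
      case other                     => println(s"offer result: $other")
    }
  }
}
```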

How do I subscribe to a reactive streams implementation running on a different JVM?

Submitted by China☆狼群 on 2019-12-04 15:59:12
Let's assume we have two Akka Stream flows, each running on its own JVM.

    // A reactive streams publisher running on JVM 1:
    val stringPublisher: Publisher[String] =
      Source(() => "Lorem Ipsum".split("\\s").iterator).runWith(Sink.publisher[String])

    // A reactive streams subscriber running on JVM 2:
    def subscriber: Subscriber[String] =
      Sink.foreach[String](println(_)).runWith(Source.subscriber[String])

    // Subscribe the second stream to the first stream
    stringPublisher.subscribe(subscriber)

This example runs fine on one JVM, but how can I subscribe to a publisher running on a different JVM? Do I
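Later Akka versions (2.5.10 and up) ship "stream refs" for exactly this case: a SourceRef is a serializable handle that carries back-pressure across Akka remoting. A sketch, assuming remoting/cluster is already configured and the actor names are hypothetical (in the earliest 2.5.x releases the materialized value was a Future[SourceRef] rather than a SourceRef):

```scala
import akka.NotUsed
import akka.actor.Actor
import akka.stream.{ActorMaterializer, SourceRef}
import akka.stream.scaladsl.{Sink, Source, StreamRefs}

// Message carrying the handle from JVM 1 to JVM 2 over remoting.
case class WordsOffer(ref: SourceRef[String])

// JVM 1: expose a local Source as a serializable SourceRef.
class WordsPublisher extends Actor {
  implicit val mat = ActorMaterializer()
  def receive = {
    case "give-me-words" =>
      val source: Source[String, NotUsed] = Source("Lorem Ipsum".split("\\s").toList)
      val ref: SourceRef[String] = source.runWith(StreamRefs.sourceRef())
      sender() ! WordsOffer(ref)
  }
}

// JVM 2, after receiving the WordsOffer:
//   offer.ref.runWith(Sink.foreach(println))
// Demand from this sink is propagated over the network back to JVM 1.
```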

Reading a large file using Akka Streams

Submitted by 情到浓时终转凉″ on 2019-12-04 14:32:27
I'm trying out Akka Streams and here is a short snippet that I have:

    override def main(args: Array[String]) {
      val filePath = "/Users/joe/Softwares/data/FoodFacts.csv" // args(0)
      val file = new File(filePath)
      println(file.getAbsolutePath)

      // read 1 MB of the file as a stream
      val fileSource = SynchronousFileSource(file, 1 * 1024 * 1024)
      val shaFlow = fileSource.map { chunk =>
        println(s"the string obtained is ${chunk.toString}")
      }
      shaFlow.to(Sink.foreach(println(_))).run // fails with a null pointer

      def sha256(s: String) = {
        val messageDigest = MessageDigest.getInstance("SHA-256")
        messageDigest.digest
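A version of the same idea using FileIO (SynchronousFileSource was later renamed to FileIO.fromFile/fromPath) with an incremental SHA-256 update per chunk, so the file never has to fit in memory. The file path is the question's; everything else is a sketch.

```scala
import java.nio.file.Paths
import java.security.MessageDigest

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{FileIO, Sink}
import akka.util.ByteString

object FileSha256 {
  // Pure helper: hex-encode the SHA-256 digest of a whole string.
  def sha256Hex(s: String): String = {
    val md = MessageDigest.getInstance("SHA-256")
    md.digest(s.getBytes("UTF-8")).map("%02x".format(_)).mkString
  }

  def main(args: Array[String]): Unit = {
    implicit val system = ActorSystem()
    implicit val materializer = ActorMaterializer()
    import system.dispatcher

    val md = MessageDigest.getInstance("SHA-256")
    // Stream the file in 1 MB chunks, updating the digest as chunks arrive.
    FileIO.fromPath(Paths.get("/Users/joe/Softwares/data/FoodFacts.csv"), chunkSize = 1024 * 1024)
      .runWith(Sink.foreach((chunk: ByteString) => md.update(chunk.toArray)))
      .map { _ =>
        println(md.digest().map("%02x".format(_)).mkString)
        system.terminate()
      }
  }
}
```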

How to use Akka-HTTP client websocket send message

Submitted by ℡╲_俬逩灬. on 2019-12-04 14:07:17
Question: I'm trying out the client-side WebSocket by following the docs at webSocketClientFlow. The sample code is:

    import akka.actor.ActorSystem
    import akka.Done
    import akka.http.scaladsl.Http
    import akka.stream.ActorMaterializer
    import akka.stream.scaladsl._
    import akka.http.scaladsl.model._
    import akka.http.scaladsl.model.ws._

    import scala.concurrent.Future

    object WebSocketClientFlow {
      def main(args: Array[String]) = {
        implicit val system = ActorSystem()
        implicit val materializer = ActorMaterializer()
        import system
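To send messages, the flow from the docs is joined between a Source of outgoing messages and a Sink for incoming ones. A compact sketch (the echo-server URL is an assumption, not from the docs):

```scala
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.ws.{Message, TextMessage, WebSocketRequest}
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Keep, Sink, Source}

object WsClient extends App {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()
  import system.dispatcher

  // Messages we send; completing this source closes our outgoing side.
  val outgoing = Source.single(TextMessage("hello"))

  // What we do with messages the server sends back.
  val incoming = Sink.foreach[Message] {
    case TextMessage.Strict(text) => println(s"got: $text")
    case other                    => println(s"streamed or binary message: $other")
  }

  val wsFlow = Http().webSocketClientFlow(WebSocketRequest("ws://echo.example.com"))

  // Materializes (Future[WebSocketUpgradeResponse], Future[Done]).
  val (upgrade, closed) =
    outgoing.viaMat(wsFlow)(Keep.right).toMat(incoming)(Keep.both).run()

  upgrade.foreach(u => println(s"upgrade status: ${u.response.status}"))
  closed.foreach(_ => system.terminate())
}
```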

End-to-End Reactive Streaming RESTful service (a.k.a. Back-Pressure over HTTP)

Submitted by 廉价感情. on 2019-12-04 13:35:59
Question: I have been trying to clarify this question online for a while without success, so I will try to ask it here. I would like to find some resource or example showing how to build an end-to-end, fully back-pressured REST service and client. What I mean is that, given a REST client that implements Reactive Streams (whether in Akka, JS, or whatever), I would like to have (and be able to visualise) back-pressure handled throughout a REST server built, e.g., with Akka
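One concrete building block worth knowing: in akka-http, when the response entity is itself a Source of bytes, back-pressure reaches end to end, because TCP flow control propagates the client's demand back into the server-side stream. A sketch (an unbounded, throttled source that only advances as fast as the client reads):

```scala
import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.{ContentTypes, HttpEntity}
import akka.http.scaladsl.server.Directives._
import akka.stream.{ActorMaterializer, ThrottleMode}
import akka.stream.scaladsl.Source
import akka.util.ByteString

object StreamingServer extends App {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()

  val route = path("numbers") {
    get {
      // An infinite source: only client demand (via TCP) pulls elements.
      val numbers = Source.fromIterator(() => Iterator.from(1))
        .throttle(10, 1.second, 10, ThrottleMode.Shaping)
        .map(n => ByteString(s"$n\n"))
      // A Source-backed entity is sent with chunked transfer encoding.
      complete(HttpEntity(ContentTypes.`text/plain(UTF-8)`, numbers))
    }
  }

  Http().bindAndHandle(route, "localhost", 8080)
}
```

A slow client (e.g. `curl --limit-rate 1k`) then slows the server-side source down instead of forcing buffering, which makes the back-pressure directly observable.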

How do you throttle Flow in the latest Akka (2.4.6)?

Submitted by 守給你的承諾、 on 2019-12-04 10:49:26
How do you throttle a Flow in the latest Akka (2.4.6)? I'd like to throttle my HTTP client flow to limit the number of requests to 3 per second. I found the following example online, but it's for an old version of Akka and the akka-streams API has changed so much that I can't figure out how to rewrite it:

    def throttled[T](rate: FiniteDuration): Flow[T, T] = {
      val tickSource: Source[Unit] = TickSource(rate, rate, () => ())
      val zip = Zip[T, Unit]
      val in = UndefinedSource[T]
      val out = UndefinedSink[T]
      PartialFlowGraph { implicit builder =>
        import FlowGraphImplicits._
        in ~> zip.left ~> Flow[(T, Unit)].map { case (t, _) =>
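In the 2.4.x line, the zip-with-ticks workaround is no longer necessary: Source and Flow have a built-in throttle combinator. A sketch for 3 requests per second (the surrounding pool flow is assumed, not shown):

```scala
import scala.concurrent.duration._

import akka.NotUsed
import akka.stream.ThrottleMode
import akka.stream.scaladsl.Flow

// Shaping delays elements to match the rate; ThrottleMode.Enforcing would
// instead fail the stream when the rate is exceeded.
def throttled[T]: Flow[T, T, NotUsed] =
  Flow[T].throttle(elements = 3, per = 1.second, maximumBurst = 3, mode = ThrottleMode.Shaping)

// Usage (hypothetical names): requestSource.via(throttled).via(poolFlow)...
```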

Reading a CSV files using Akka Streams

Submitted by 耗尽温柔 on 2019-12-04 10:29:05
Question: I'm reading a CSV file. I am using Akka Streams to do this, so that I can build a graph of actions to perform on each line. I've got the following toy example up and running:

    def main(args: Array[String]): Unit = {
      implicit val system = ActorSystem("MyAkkaSystem")
      implicit val materializer = ActorMaterializer()

      val source = akka.stream.scaladsl.Source.fromIterator(() => Source.fromFile("a.csv").getLines)
      val sink = Sink.foreach(println)
      source.runWith(sink)
    }

The two Source types don't sit easy
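The clash between scala.io.Source and akka's Source disappears if the file is read through FileIO and split into lines with Framing. A sketch; the naive comma split is only for illustration and ignores quoted fields:

```scala
import java.nio.file.Paths

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{FileIO, Framing, Sink}
import akka.util.ByteString

object CsvReader {
  // Pure helper: split one CSV line on commas (no quoting support).
  def parseLine(line: String): Vector[String] = line.split(",", -1).toVector

  def main(args: Array[String]): Unit = {
    implicit val system = ActorSystem("MyAkkaSystem")
    implicit val materializer = ActorMaterializer()

    FileIO.fromPath(Paths.get("a.csv"))
      // Re-chunk arbitrary byte chunks into one ByteString per line.
      .via(Framing.delimiter(ByteString("\n"), maximumFrameLength = 4096, allowTruncation = true))
      .map(bytes => parseLine(bytes.utf8String))
      .runWith(Sink.foreach(println))
  }
}
```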

How can Akka streams be materialized continually?

Submitted by 无人久伴 on 2019-12-04 07:39:24
I am using Akka Streams in Scala to poll an AWS SQS queue using the AWS Java SDK. I created an ActorPublisher which dequeues messages on a two-second interval:

    class SQSSubscriber(name: String) extends ActorPublisher[Message] {
      implicit val materializer = ActorMaterializer()

      val schedule = context.system.scheduler.schedule(0.seconds, 2.seconds, self, "dequeue")

      val client = new AmazonSQSClient()
      client.setRegion(RegionUtils.getRegion("us-east-1"))
      val url = client.getQueueUrl(name).getQueueUrl

      val MaxBufferSize = 100
      var buf = Vector.empty[Message]

      override def receive: Receive = {
        case
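An alternative that stays materialized indefinitely without a hand-written ActorPublisher: drive the poll from Source.tick. The region and SDK calls are the question's; the queue name is a placeholder. A sketch:

```scala
import scala.collection.JavaConverters._
import scala.concurrent.Future
import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import com.amazonaws.regions.RegionUtils
import com.amazonaws.services.sqs.AmazonSQSClient
import com.amazonaws.services.sqs.model.Message

object SqsPoller extends App {
  implicit val system = ActorSystem()
  implicit val materializer = ActorMaterializer()
  import system.dispatcher

  val client = new AmazonSQSClient()
  client.setRegion(RegionUtils.getRegion("us-east-1"))
  val url = client.getQueueUrl("my-queue").getQueueUrl

  // Tick every 2 seconds; mapAsync(1) keeps polls from overlapping, and
  // downstream back-pressure simply pauses the ticks instead of overflowing a buffer.
  Source.tick(0.seconds, 2.seconds, ())
    .mapAsync(1)(_ => Future(client.receiveMessage(url).getMessages.asScala.toList))
    .mapConcat(identity)
    .runWith(Sink.foreach((m: Message) => println(m.getBody)))
}
```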