akka-stream

Use a Partition to decide if a flow should go on to the next flow, and if not, have the stream pick it up on the next tick from the beginning (Akka)

若如初见. Submitted on 2020-12-13 04:57:04

Question: I want to create a Flow that has 4 main steps (FlowShapes). After the first and the second I want to have a Partition that decides whether there is a reason to go on to the next step, and if not, sinks the element so that the stream will pick it up later and start from the beginning. But I'm not sure this is the right way, because I just used Sink.ignore. It looks like this: def mainFlow: Flow[MyGraphElement, MyGraphElement, NotUsed] = Flow.fromGraph(GraphDSL.create() { implicit builder => // FlowShape's // those flow
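One way to avoid dropping the "not ready" elements in Sink.ignore is to route them out of the graph tagged separately, so the caller can re-queue them on the next tick. A minimal sketch, assuming a hypothetical element type Elem in place of MyGraphElement and an even/odd predicate standing in for the real readiness check:

```scala
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.FlowShape
import akka.stream.scaladsl.{Flow, GraphDSL, Merge, Partition, Sink, Source}
import scala.concurrent.Await
import scala.concurrent.duration._

object PartitionDemo extends App {
  implicit val system: ActorSystem = ActorSystem("partition-demo")

  // Hypothetical stand-in for MyGraphElement.
  final case class Elem(n: Int)

  // Elements that should continue come out as Right, elements to be
  // retried later come out as Left instead of being silently ignored.
  val mainFlow: Flow[Elem, Either[Elem, Elem], NotUsed] =
    Flow.fromGraph(GraphDSL.create() { implicit b =>
      import GraphDSL.Implicits._

      val step1     = b.add(Flow[Elem].map(identity)) // first main step
      val partition = b.add(Partition[Elem](2, e => if (e.n % 2 == 0) 0 else 1))
      val continue  = b.add(Flow[Elem].map(e => Right(e): Either[Elem, Elem]))
      val requeue   = b.add(Flow[Elem].map(e => Left(e): Either[Elem, Elem]))
      val merge     = b.add(Merge[Either[Elem, Elem]](2))

      step1 ~> partition
      partition.out(0) ~> continue ~> merge
      partition.out(1) ~> requeue  ~> merge

      FlowShape(step1.in, merge.out)
    })

  val out = Await.result(
    Source(List(Elem(1), Elem(2), Elem(3))).via(mainFlow).runWith(Sink.seq),
    5.seconds)
  out.foreach(println)
  system.terminate()
}
```

A downstream consumer can then feed the Left elements back into the source queue, rather than losing them as Sink.ignore would.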

How to write CSV file with headers using akka stream alpakka?

…衆ロ難τιáo~ Submitted on 2020-08-26 08:01:09

Question: I can't seem to find it, hence I turn to Slack to ask: is there a way to write a CSV file with its headers using Akka Stream Alpakka? The only thing I see is https://doc.akka.io/docs/alpakka/current/data-transformations/csv.html#csv-formatting, but no reverse of the CSV-to-map operation. My use case is that I need to read a few CSV files, filter their content, and write the clean content to a corresponding file originalcsvfilename-cleaned.csv. If it is not directly supported, any
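One straightforward approach with Alpakka's CsvFormatting is simply to prepend the header row as the first record before formatting. A sketch, assuming hypothetical header and row data in place of the asker's filtered CSV content:

```scala
import akka.actor.ActorSystem
import akka.stream.alpakka.csv.scaladsl.CsvFormatting
import akka.stream.scaladsl.{FileIO, Source}
import java.nio.file.Paths
import scala.concurrent.Await
import scala.concurrent.duration._

object CsvWithHeader extends App {
  implicit val system: ActorSystem = ActorSystem("csv-demo")

  // Hypothetical header and already-filtered rows.
  val header = List("name", "age")
  val rows   = List(List("alice", "30"), List("bob", "25"))

  // CsvFormatting.format turns each List[String] into one CSV line as a
  // ByteString; prepending the header list writes it as the first line.
  val path = Paths.get("originalcsvfilename-cleaned.csv")
  Await.result(
    Source(header :: rows)
      .via(CsvFormatting.format())
      .runWith(FileIO.toPath(path)),
    5.seconds)

  // Print the first line of the written file to show the header landed.
  println(scala.io.Source.fromFile(path.toFile).getLines().next())
  system.terminate()
}
```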

How does Akka Stream's Keep right/left/both result in a different output?

廉价感情. Submitted on 2020-08-10 20:20:26

Question: I am trying to wrap my head around how Keep works in Akka Streams. Reading the answers in What does Keep in akka stream mean, I understand that it helps control whether we get the result from the left/right/both sides of the materializer. However, I still can't build an example where I can change the value of left/right and get different results. For example, implicit val system: ActorSystem = ActorSystem("Playground") implicit val materializer: ActorMaterializer = ActorMaterializer() val
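An example where the choice of Keep visibly changes the result is one where the two sides materialize different types: Source(1 to 3) materializes NotUsed, while a folding Sink materializes a Future. Changing Keep.left to Keep.right changes the type (and value) that run() returns. A sketch (using an implicit ActorSystem as materializer, as in Akka 2.6, rather than the deprecated ActorMaterializer from the question):

```scala
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Keep, Sink, Source}
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

object KeepDemo extends App {
  implicit val system: ActorSystem = ActorSystem("Playground")

  val source = Source(1 to 3)
  val sink   = Sink.fold[Int, Int](0)(_ + _)

  // Keep.left keeps the source's materialized value (NotUsed)...
  val left: NotUsed = source.toMat(sink)(Keep.left).run()
  // ...Keep.right keeps the sink's (Future[Int])...
  val right: Future[Int] = source.toMat(sink)(Keep.right).run()
  // ...and Keep.both keeps the pair of the two.
  val both: (NotUsed, Future[Int]) = source.toMat(sink)(Keep.both).run()

  println(left)
  println(Await.result(right, 5.seconds))
  println(Await.result(both._2, 5.seconds))
  system.terminate()
}
```

With Keep.left there is no handle on the fold's result at all; with Keep.right (or Keep.both) the sum of the elements is available once the stream completes.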

What does Keep in akka stream mean?

纵然是瞬间 Submitted on 2020-07-08 10:32:14

Question: I am learning Akka Streams and encountered Keep.left and Keep.right in this code: implicit val system = ActorSystem("KafkaProducer") implicit val materializer = ActorMaterializer() val source = Source(List("a", "b", "c")) val sink = Sink.fold[String, String]("")(_ + _) val runnable: RunnableGraph[Future[String]] = source.toMat(sink)(Keep.right) val result: Future[String] = runnable.run() What does Keep.right mean here? Answer 1: Every stream processing stage can produce a materialized value which can
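In the question's snippet, Keep.right selects the materialized value of the right-hand operand of toMat, i.e. the sink's Future[String], which is why runnable.run() returns a Future that completes with the folded string. A runnable version of that snippet (with the deprecated ActorMaterializer dropped in favour of an implicit ActorSystem):

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Keep, RunnableGraph, Sink, Source}
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

object KeepRightDemo extends App {
  implicit val system: ActorSystem = ActorSystem("KafkaProducer")

  val source = Source(List("a", "b", "c"))
  val sink   = Sink.fold[String, String]("")(_ + _)

  // Keep.right: run() yields the sink's materialized value, the
  // Future[String] completed by the fold when the stream finishes.
  val runnable: RunnableGraph[Future[String]] = source.toMat(sink)(Keep.right)
  val result: Future[String] = runnable.run()

  println(Await.result(result, 5.seconds))
  system.terminate()
}
```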

How to clean up substreams in continuous Akka streams

泄露秘密 Submitted on 2020-06-25 09:18:14

Question: Given that I have a very long-running stream of events flowing through something as shown below, after a long time there will be many substreams created that are no longer needed. Is there a way to clean up a specific substream at a given time? For example, the substream created by id 3 should be cleaned up, and the state in the scan method lost, at 13:00 (the expires property of Wid): case class Wid(id: Int, v: String, expires: LocalDateTime) test("Substream with scan") { val (pub, sub) =
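One possible pattern (an assumption, not the only approach) is to complete each groupBy substream after it has been idle for some duration: idleTimeout fails the substream with a TimeoutException, and recoverWithRetries turns that failure into normal completion, dropping the substream and its scan state. A sketch with a tiny in-memory source standing in for the long-running event stream:

```scala
import java.time.LocalDateTime
import java.util.concurrent.TimeoutException
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.Await
import scala.concurrent.duration._

object SubstreamCleanup extends App {
  implicit val system: ActorSystem = ActorSystem("substreams")

  case class Wid(id: Int, v: String, expires: LocalDateTime)

  val events = List(Wid(1, "a", LocalDateTime.now), Wid(1, "b", LocalDateTime.now))

  val cleaned =
    Source(events)
      .groupBy(maxSubstreams = 64, _.id)
      // Complete a substream that has seen no element for 1 second;
      // its scan state is discarded along with it.
      .idleTimeout(1.second)
      .scan("")((acc, w) => acc + w.v)
      .recoverWithRetries(1, { case _: TimeoutException => Source.empty })
      .mergeSubstreams

  val out = Await.result(cleaned.runWith(Sink.seq), 10.seconds)
  out.foreach(println)
  system.terminate()
}
```

Expiring at an absolute time (the Wid.expires property) rather than on idleness would need a per-substream takeWhile or a custom stage; the idle-timeout version above is the simpler sketch.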

Akka Stream Kafka vs Kafka Streams

牧云@^-^@ Submitted on 2020-05-09 17:57:05

Question: I am currently working with Akka Stream Kafka to interact with Kafka, and I was wondering what the differences with Kafka Streams are. I know that the Akka-based approach implements the Reactive Streams specification and handles back-pressure, functionality that Kafka Streams seems to be lacking. What would be the advantage of using Kafka Streams over Akka Stream Kafka? Answer 1: Your question is very general, so I'll give a general answer from my point of view. First, I've got two usage scenarios:

Migrating Play framework 2.4 chunked response to play 2.5/2.6 akka stream Source

﹥>﹥吖頭↗ Submitted on 2020-01-25 08:13:33

Question: My goal is to migrate from a Java Play 2.4 chunked response to an Akka Streams Source. Essentially from: public Result getDCcsv() { response().setHeader("Content-Type", "text/csv"); response().setHeader("Content-Disposition", "attachment;filename=users.csv"); response().setHeader("Cache-control", "private"); return ok(new Results.Chunks<String>() { public void onReady(Results.Chunks.Out<String> out) { // blocking calls to read data from DB in manageable chunks out.write(data) } }); } to public Result getDCcsv()
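In Play 2.6 the Chunks API is replaced by Ok.chunked over an Akka Streams Source. A sketch of the Scala-API equivalent, assuming a hypothetical blocking Cursor abstraction over the DB reads (the asker's code is Java; the shape is the same with Play's Java Results.ok().chunked):

```scala
import akka.stream.scaladsl.Source
import akka.util.ByteString
import play.api.mvc._

class CsvController(cc: ControllerComponents) extends AbstractController(cc) {

  // Hypothetical blocking cursor over the DB, replacing Chunks.Out.
  trait Cursor { def nextChunk(): Option[String]; def close(): Unit }
  def openCursor(): Cursor = ??? // application-specific

  def getDCcsv: Action[AnyContent] = Action {
    // unfoldResource wraps open/read/close of a blocking resource in a
    // Source; each Some(chunk) becomes one chunk of the response.
    val source: Source[ByteString, _] =
      Source
        .unfoldResource[String, Cursor](() => openCursor(), _.nextChunk(), _.close())
        .map(ByteString(_))

    Ok.chunked(source)
      .as("text/csv")
      .withHeaders(
        "Content-Disposition" -> "attachment;filename=users.csv",
        "Cache-control"       -> "private")
  }
}
```

Running the blocking reads on a dedicated dispatcher (e.g. via Source.unfoldResource's built-in IO dispatcher handling) keeps them off the default thread pool.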

How to create an akka-stream Source from a Flow that generates values recursively?

最后都变了- Submitted on 2020-01-22 13:12:27

Question: I need to traverse an API that is shaped like a tree, for example a directory structure or threads of discussion. It can be modeled via the following flow: type ItemId = Int type Data = String case class Item(data: Data, kids: List[ItemId]) def randomData(): Data = scala.util.Random.alphanumeric.take(2).mkString // 0 => [1, 9] // 1 => [10, 19] // 2 => [20, 29] // ... // 9 => [90, 99] // _ => [] // NB. I don't have access to this function, only the itemFlow. def nested(id: ItemId): List
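When the children lookup is a plain function, a breadth-first traversal can be expressed with Source.unfold carrying the frontier of ids still to visit. A sketch; the nested function below is a hypothetical reconstruction from the question's comments (the asker only has itemFlow, not this function, so a real solution would need a feedback cycle through that Flow instead):

```scala
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.Await
import scala.concurrent.duration._

object TreeTraversal extends App {
  implicit val system: ActorSystem = ActorSystem("tree")

  type ItemId = Int

  // Hypothetical reconstruction of the tree from the question's comments:
  // 0 => 1..9, n (1..9) => n*10 .. n*10+9, everything else is a leaf.
  def nested(id: ItemId): List[ItemId] =
    if (id == 0) (1 to 9).toList
    else if (id <= 9) (id * 10 to id * 10 + 9).toList
    else Nil

  // Breadth-first: the unfold state is the queue of ids to visit; each
  // step emits one id and appends its children to the queue.
  def traverse(root: ItemId): Source[ItemId, NotUsed] =
    Source.unfold(List(root)) {
      case Nil        => None
      case id :: rest => Some((rest ++ nested(id), id))
    }

  val ids = Await.result(traverse(0).runWith(Sink.seq), 5.seconds)
  println(ids.size) // 1 root + 9 children + 90 grandchildren = 100
  system.terminate()
}
```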
