akka-stream

Column family ID mismatch (found 52ac10b0-6e1b-11e7-82d3-c39cc53c1347; expected 42e51050-6e1b-11e7-82d3-c39cc53c1347)

Question: I am running the Lagom online-auction-java sample and am getting the error below when I execute sbt clean runAll:

ERROR [Native-Transport-Requests-13] 2017-07-21 16:52:45,704 ErrorMessage.java:384 - Unexpected exception during request java.lang.RuntimeException: java.util.concurrent.ExecutionException: org.apache.cassandra.exceptions.ConfigurationException: Column family ID mismatch (found 52ac10b0-6e1b-11e7-82d3-c39cc53c1347; expected 42e51050-6e1b-11e7-82d3-c39cc53c1347) at org.apache…

How to debug Akka Streams flows?

Question: When I set a breakpoint somewhere in the method processLine, the debugger does not stop at that line; it executes as if there were no breakpoint. Is debugging Akka Streams flows somehow different, and how can I solve this issue?

val stream = source.map(csvLine => A.processLine(csvLine)).runWith(Sink)

Answer 1: I have had similar issues with ScalaIDE. My solution has generally been to isolate my "business logic" from any akka dependencies:

// no akka imports required
case class Tweet(val author: String, …
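The answer's code is cut off above; here is a minimal sketch of the approach it describes, keeping the domain logic free of Akka so breakpoints and plain unit tests hit it directly. The Tweet field beyond author and the CSV format are my guesses:

```scala
// Plain domain logic: no Akka imports, so the debugger and ordinary
// unit tests can step through it without a running stream.
case class Tweet(author: String, body: String)

object A {
  def processLine(csvLine: String): Tweet = {
    val Array(author, body) = csvLine.split(",", 2) // breakpoints work here
    Tweet(author, body)
  }
}
```

The stream then just lifts the pure function with source.map(A.processLine), and most debugging happens against A.processLine in isolation rather than inside the materialized stream.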

What's the difference between Supervision.Restart and Supervision.Resume really?

Question: What's the difference between Supervision.Restart and Supervision.Resume, really? Here is the situation: I have 3 elements coming from Source(List(1, 2, 3)). In runForeach I throw an exception if the element is 2. With Supervision.Restart I expected only 1 to be processed, but oddly I see 3 reaching the sink. Why? I'm using Akka 2.4.11.

import akka.actor.{ActorRef, ActorSystem}
import akka.stream.{ActorMaterializer, ActorMaterializerSettings, OverflowStrategy, Supervision}
import akka.stream…
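No answer is attached to the question as scraped. The short version: both strategies drop only the element that caused the failure and neither replays earlier elements or stops the stream, which is why 3 still reaches the sink; Restart additionally resets any state the failing stage had accumulated. A sketch adapted from the supervision example in the Akka documentation, using scan because it is a stage with visible internal state:

```scala
import akka.actor.ActorSystem
import akka.stream.{ActorAttributes, ActorMaterializer, Supervision}
import akka.stream.scaladsl.{Flow, Source}

object SupervisionDemo extends App {
  implicit val system = ActorSystem("demo")
  implicit val mat    = ActorMaterializer() // Akka 2.4.x-era materializer

  // A stage with internal state: a running sum that blows up on 2.
  def summing(decider: Supervision.Decider) =
    Flow[Int]
      .scan(0)((acc, n) => if (n == 2) throw new RuntimeException("boom") else acc + n)
      .withAttributes(ActorAttributes.supervisionStrategy(decider))

  // Resume drops the failing element and KEEPS the accumulated state:
  // prints 0, 1, 4
  Source(List(1, 2, 3)).via(summing(Supervision.resumingDecider))
    .runForeach(n => println(s"resume:  $n"))

  // Restart drops the failing element and RESETS the stage to its seed,
  // emitting the seed again: prints 0, 1, 0, 3
  Source(List(1, 2, 3)).via(summing(Supervision.restartingDecider))
    .runForeach(n => println(s"restart: $n"))
}
```

In the question's stateless runForeach the two strategies are indistinguishable: either way the failing element 2 is dropped and 1 and 3 pass through.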

Triggering on FileIO.toPath sink completion

Question: I have an IO source that I'm multiplexing to a file and handing off to another part of the system to execute, like:

source.alsoTo(FileIO.toPath(path))

This is used for a cache, i.e. I'm writing a file passing through my system to a temporary cache location. Once the cache is written, I want to move it atomically to its final location. The consumer of the source gets a Future and knows when it is done, but my side channel writing the file to disk has no such facility that I can find. Is there some way…
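No answer is attached here, but one way to get at the sink's completion is alsoToMat with Keep.right, which surfaces FileIO.toPath's materialized Future[IOResult]; that future completes once the file has been written. A sketch, where the paths, the payload, and the Sink.ignore standing in for the real consumer are all hypothetical:

```scala
import java.nio.file.{Files, Paths, StandardCopyOption}
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{FileIO, Keep, Sink, Source}
import akka.util.ByteString

implicit val system = ActorSystem("cache-demo")
implicit val mat    = ActorMaterializer()
import system.dispatcher

val tmpPath   = Paths.get("/tmp/cache.part") // hypothetical locations
val finalPath = Paths.get("/tmp/cache.dat")

val source = Source.single(ByteString("payload"))

// alsoToMat + Keep.right keeps the file sink's Future[IOResult] as the
// materialized value, next to whatever the downstream consumer returns.
val (written, _) = source
  .alsoToMat(FileIO.toPath(tmpPath))(Keep.right) // written: Future[IOResult]
  .toMat(Sink.ignore)(Keep.both)                 // stand-in for the consumer
  .run()

written.foreach { _ =>
  // The cache file is fully written: move it into place atomically.
  Files.move(tmpPath, finalPath, StandardCopyOption.ATOMIC_MOVE)
}
```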

Akka.Net Streams and remoting (Sink.ActorRefWithAck)

Question: I've made quite a simple implementation with Akka.NET Streams using Sink.ActorRefWithAck: a subscriber asks a publisher for a large string, which the publisher sends in slices. It works perfectly fine locally (in a unit test) but not remotely, and I cannot understand what's wrong. Concretely: the subscriber is able to send the request to the publisher, which responds with an OnInit message, but then the OnInit.Ack never goes back to the publisher. This Ack message ends up as a dead letter: INFO Akka.Actor…
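The question is about Akka.NET, but the same trap exists on the JVM, so here is a Scala sketch of one likely culprit and its remedy: the acknowledgements from an ack-based sink flow back to a stream-internal stage actor, and such stage actors are not location-transparent, so over remoting the Ack can end up a dead letter. Akka on the JVM added StreamRefs (2.5.10+) for spanning streams across nodes; the message protocol below is hypothetical:

```scala
import akka.actor.Actor
import akka.stream.{ActorMaterializer, SinkRef}
import akka.stream.scaladsl.{Sink, StreamRefs}

// Hypothetical protocol; both messages must be serializable on both nodes.
case object PrepareUpload
final case class SlicesSinkReady(sinkRef: SinkRef[String])

// Subscriber node: materialize the sink locally and ship a serializable
// SinkRef handle to the remote publisher instead of a stage-internal ref.
class Subscriber extends Actor {
  implicit val mat = ActorMaterializer()(context)

  def receive = {
    case PrepareUpload =>
      val sinkRef: SinkRef[String] =
        StreamRefs.sinkRef[String]()
          .to(Sink.foreach(slice => println(s"got ${slice.length} chars")))
          .run()
      sender() ! SlicesSinkReady(sinkRef)
  }
}

// Publisher node, once SlicesSinkReady(ref) arrives:
//   slicesSource.runWith(ref.sink())  // backpressure travels over the wire
```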

What is the purpose of the composite flow (from Sink and Source)?

Question: I am trying to understand the composite flow (from Sink and Source) from the website, where it is represented as follows. Could someone please provide an example of using a composite flow, and say when I should use it?

Answer 1: Flow.fromSinkAndSource provides a convenient way to assemble a flow from a sink, serving as its input, and a source, serving as its output, that are not connected to each other. This is best illustrated with the following diagram (available in the API link):

+----------------------------------…
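A minimal sketch of such a composite flow; the printed label and the canned replies are arbitrary:

```scala
import akka.NotUsed
import akka.stream.scaladsl.{Flow, Sink, Source}

// Inbound elements disappear into the sink; outbound elements come from
// the source. The two sides are deliberately NOT connected to each other.
val protocolHandler: Flow[String, String, NotUsed] =
  Flow.fromSinkAndSource(
    Sink.foreach[String](msg => println(s"peer said: $msg")),
    Source(List("hello", "goodbye"))
  )
```

The shape is mostly useful where an API insists on receiving a Flow even though the two directions are logically independent, e.g. handing a handler to a TCP binding or a WebSocket route, or wiring a stub into one side of a bidirectional pipeline.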

Using Akka Streams to go over a Mongo collection

Question: I have a collection of people in Mongo, and I want to go over each person in the collection as a stream and, for each person, call a method that performs an API call, changes the model, and inserts into a new collection in Mongo. It looks like this:

def processPeople()(implicit m: Materializer): Future[Unit] = {
  val peopleSource: Source[Person, Future[State]] =
    collection.find(json()).cursor[Person]().documentSource()
  peopleSource.runWith(Sink.seq[Person]).map(people => {
    people.foreach…
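The question's code is cut off above, but running the cursor into Sink.seq buffers the whole collection in memory before any processing starts. A hedged sketch of a fully streaming alternative, reusing the question's collection, Person, and json() definitions and assuming a hypothetical processPerson step for the API call and insert:

```scala
import akka.Done
import akka.stream.Materializer
import akka.stream.scaladsl.Sink
import scala.concurrent.Future

// Hypothetical per-person step: the API call, model change, and insert.
def processPerson(p: Person): Future[Done] = ???

// mapAsync keeps the Mongo cursor streaming with bounded concurrency and
// backpressure, instead of collecting everyone into a Seq first.
def processPeople()(implicit m: Materializer): Future[Done] =
  collection.find(json()).cursor[Person]().documentSource()
    .mapAsync(parallelism = 4)(processPerson)
    .runWith(Sink.ignore)
```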

Akka Streams Covariance for SourceQueueWithComplete

Question: I have an Actor which is created from within a SupervisorActor, and this Actor is responsible for pushing the messages that it gets into a stream. Here is the Actor:

class KafkaPublisher[T <: KafkaMessage: ClassTag] extends Actor {
  implicit val system = context.system
  val log = Logging(system, this.getClass.getName)

  override final def receive = {
    case ProducerStreamActivated(_, stream: SourceQueueWithComplete[T]) =>
      log.info(s"Activated stream for Kafka Producer with ActorName >> ${self.path…
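Two things conspire against that pattern match: the element type is erased at runtime, so stream: SourceQueueWithComplete[T] matches any queue regardless of T, and SourceQueueWithComplete is invariant in its type parameter, so the compiler will not widen a queue of a subtype to one of KafkaMessage. A sketch of the usual workaround, matching on the envelope and casting; the message protocol here is reconstructed from the question, not its actual code:

```scala
import akka.actor.Actor
import akka.event.Logging
import akka.stream.scaladsl.SourceQueueWithComplete
import scala.reflect.ClassTag

// Reconstructed protocol, mirroring the question's names.
trait KafkaMessage
final case class ProducerStreamActivated[T](topic: String, stream: SourceQueueWithComplete[T])

class KafkaPublisher[T <: KafkaMessage: ClassTag] extends Actor {
  val log = Logging(context.system, this.getClass.getName)

  override final def receive = {
    // T is erased at runtime, so the match cannot check the element type;
    // bind the queue untyped and cast, relying on the supervisor to have
    // created the publisher and its stream with matching types.
    case ProducerStreamActivated(topic, stream) =>
      val queue = stream.asInstanceOf[SourceQueueWithComplete[T]]
      log.info(s"Activated stream for Kafka producer ${self.path.name}, topic $topic")
      // subsequent messages are pushed with queue.offer(...)
  }
}
```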

What is the passthrough used for in alpakka-kafka connector while producing messages?

Question: This is the code for producing a single message to Kafka, as given in the doc https://doc.akka.io/docs/alpakka-kafka/current/producer.html:

val single: ProducerMessage.Envelope[KeyType, ValueType, PassThroughType] =
  ProducerMessage.single(
    new ProducerRecord("topicName", key, value),
    passThrough
  )

Could you please explain what passThrough is used for?

Answer 1: passThrough is an additional value that becomes available in ProducerMessage.Results' passThrough(). As the official…
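The answer is cut off above. The canonical use, per the Alpakka Kafka docs, is threading the consumer's committable offset through the producer in a consume-transform-produce pipeline, so the offset is committed only after the record has been written. A sketch where the topic name and the transform are placeholders:

```scala
import akka.kafka.{ConsumerMessage, ProducerMessage}
import akka.kafka.ConsumerMessage.CommittableMessage
import org.apache.kafka.clients.producer.ProducerRecord

// Hypothetical transform step: the offset rides along as the passThrough,
// untouched by the producer stage, and pops out again in the Results.
def toEnvelope(msg: CommittableMessage[String, String])
    : ProducerMessage.Envelope[String, String, ConsumerMessage.CommittableOffset] =
  ProducerMessage.single(
    new ProducerRecord("targetTopic", msg.record.key, msg.record.value.toUpperCase),
    msg.committableOffset
  )

// Downstream of Producer.flexiFlow, results.passThrough hands the offset
// back so a Committer sink can commit it -- at-least-once end to end.
```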

How do I get instances of connected Inlet and Outlet in FlowShape?

Question: How do I get instances of connected Inlet and Outlet in FlowShape? Consider the following example:

def throttleFlow[T](rate: FiniteDuration) = Flow.fromGraph(GraphDSL.create() { implicit builder =>
  import GraphDSL.Implicits._
  val ticker = Source.tick(rate, rate, Unit)
  val zip = builder.add(Zip[T, Unit.type])
  val map = Flow[(T, Unit.type)].map { case (value, _) => value }
  val messageExtractor = builder.add(map)
  val in = Inlet[T]("Req.in")
  val out = Outlet[T]("Req.out")
  out ~> zip.in0
  ticker ~> zip…
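The snippet is cut off, but the free-standing Inlet[T] and Outlet[T] are the problem: the ports of a shape must belong to stages added via builder.add, and the FlowShape is built from those ports directly rather than from newly constructed ones. A sketch of the complete throttle under that reading (it also replaces the Unit companion object in the original with the unit value ()):

```scala
import scala.concurrent.duration.FiniteDuration
import akka.NotUsed
import akka.stream.FlowShape
import akka.stream.scaladsl.{Flow, GraphDSL, Source, Zip}

// The shape's ports come from the stages added to the builder, not from
// free-standing Inlet/Outlet instances.
def throttleFlow[T](rate: FiniteDuration): Flow[T, T, NotUsed] =
  Flow.fromGraph(GraphDSL.create() { implicit builder =>
    import GraphDSL.Implicits._

    val ticker           = Source.tick(rate, rate, ())
    val zip              = builder.add(Zip[T, Unit]())
    val messageExtractor = builder.add(Flow[(T, Unit)].map(_._1))

    ticker  ~> zip.in1
    zip.out ~> messageExtractor.in

    // zip.in0 is the flow's inlet; messageExtractor.out is its outlet.
    FlowShape(zip.in0, messageExtractor.out)
  })
```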