Iterate over lines in a file in parallel (Scala)?

Asked by 遥遥无期 on 2020-12-12 22:33

I know about the parallel collections in Scala. They are handy! However, I would like to iterate over the lines of a file that is too large for memory in parallel. I could create threads and set up a lock over a Scanner, for example, but it would be great if I could run code such as the one-liner shown below.
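Something in the spirit of this (which does not compile as written, since `Iterator` has no `par` method):

    Source.fromFile(path).getLines.par foreach process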

5 Answers
  • 2020-12-12 22:58

    I'll put this as a separate answer since it's fundamentally different from my last one (and it actually works).

    Here's an outline for a solution using actors, which is basically what Kim Stebel's comment describes. There are two actor classes, a single FileReader actor that reads individual lines from the file on demand, and several Worker actors. The workers all send requests for lines to the reader, and process lines in parallel as they are read from the file.

    I'm using Akka actors here, but the same idea carries over to any other actor implementation.

    case object LineRequest
    case object BeginProcessing
    
    class FileReader extends Actor {
    
      // reads a single line from the file, or returns None at EOF
      def getLine: Option[String] = ...
    
      def receive = {
        case LineRequest => self.sender.foreach(_ ! getLine) // sender is an Option[ActorRef] in Akka 1.x
      }
    }
    
    class Worker(reader: ActorRef) extends Actor {
    
      // processes a single line
      def process(line: String): Unit = ...
    
      def receive = {
        case BeginProcessing => reader ! LineRequest // kick off the request loop
        case Some(line: String) => // got a line: process it, then ask for the next one
          process(line)
          reader ! LineRequest
        case None => self.stop // EOF: this worker is done
      }
    }
    
    val reader = actorOf[FileReader].start
    val workers = Vector.fill(4)(actorOf(new Worker(reader)).start)
    workers.foreach(_ ! BeginProcessing)
    // wait for the workers to stop...
    

    This way, no more than 4 (or however many workers you have) unprocessed lines are in memory at a time.
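    For readers on current Akka, here is a rough sketch of the same pull-based pattern against the Akka 2.x "classic" actor API (untested; `path` and the `process` function are assumptions, not part of the original answer):
    
    import akka.actor.{Actor, ActorRef, ActorSystem, Props}
    import scala.io.Source
    
    case object LineRequest
    case object BeginProcessing
    
    // Owns the line iterator, so only one actor ever touches the file.
    class FileReader(lines: Iterator[String]) extends Actor {
      def receive = {
        case LineRequest =>
          sender() ! (if (lines.hasNext) Some(lines.next()) else None)
      }
    }
    
    class Worker(reader: ActorRef, process: String => Unit) extends Actor {
      def receive = {
        case BeginProcessing => reader ! LineRequest
        case Some(line: String) => // got a line: process it, then request the next
          process(line)
          reader ! LineRequest
        case None => context.stop(self) // EOF
      }
    }
    
    val system = ActorSystem("lines")
    val reader = system.actorOf(Props(new FileReader(Source.fromFile(path).getLines())))
    val workers = Vector.fill(4)(system.actorOf(Props(new Worker(reader, process))))
    workers.foreach(_ ! BeginProcessing)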

  • 2020-12-12 23:05

    The comments on Dan Simon's answer got me thinking. Why don't we try wrapping the Source in a Stream:

    def src(source: Source): Stream[String] =
      if (source.hasNext) Stream.cons(source.takeWhile(_ != '\n').mkString, src(source))
      else Stream.empty
    

    Then you could consume it in parallel like this:

    src(Source.fromFile(path)).par foreach process
    

    I tried this out, and it compiles and runs at any rate. Note, though, that a Stream memoizes its elements, and `.par` has to traverse the whole stream to build the parallel collection, so the entire file can still end up in memory.

  • 2020-12-12 23:09

    The following worked for me (note that `toStream.par` builds a parallel collection from the whole stream, so this does read the entire file into memory):

    source.getLines.toStream.par.foreach(line => println(line))
    
  • 2020-12-12 23:19

    I realize this is an old question, but you may find `ParIterator` from the iterata library to be a useful, no-assembly-required solution:

    scala> import com.timgroup.iterata.ParIterator.Implicits._
    scala> val it = (1 to 100000).toIterator.par().map(n => (n + 1, Thread.currentThread.getId))
    scala> it.map(_._2).toSet.size
    res2: Int = 8 // addition was distributed over 8 threads
    
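    To apply this to the original problem, the same `par()` extension can be called on the lines iterator. A sketch (untested; `path` and `process` are assumptions, and as in the example above it is `map` that gets evaluated in parallel):
    
    import com.timgroup.iterata.ParIterator.Implicits._
    import scala.io.Source
    
    // par() chunks the underlying iterator and evaluates map bodies in
    // parallel per chunk; the final foreach merely drains the iterator.
    Source.fromFile(path).getLines().par().map(line => process(line)).foreach(identity)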
  • 2020-12-12 23:20

    You could use grouping to easily slice the iterator into chunks you can load into memory and then process in parallel.

    import scala.io.Source
    
    val chunkSize = 128 * 1024 // lines per chunk; tune to your memory budget
    val iterator = Source.fromFile(path).getLines().grouped(chunkSize)
    iterator.foreach { lines =>
      lines.par.foreach { line => process(line) }
    }
    

    In my opinion, something like this is the simplest way to do it.
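    A self-contained variant of the same idea that also closes the Source when done (a sketch; `path` and `process` are stand-ins):
    
    import scala.io.Source
    
    def processFile(path: String)(process: String => Unit): Unit = {
      val source = Source.fromFile(path)
      try {
        // each chunk is loaded into memory, then its lines are processed in parallel
        source.getLines().grouped(128 * 1024).foreach(_.par.foreach(process))
      } finally source.close()
    }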
