Reading a large file using Akka Streams


Well, 1.0 is deprecated; if you can, use 2.x.

When I tried version 2.0.1, using FileIO.fromFile(file) instead of SynchronousFileSource, it failed with a NullPointerException. That was simply because there was no ActorMaterializer in scope. Including one makes it work:

object TImpl extends App {
  import java.io.File

  import akka.actor.ActorSystem
  import akka.stream.ActorMaterializer
  import akka.stream.scaladsl.{FileIO, Source}
  import scala.concurrent.Future

  implicit val system = ActorSystem("Sys")
  implicit val materializer = ActorMaterializer()

  val file = new File("somefile.csv")
  // Read the file in 1 MiB chunks; the source materializes to a Future with the byte count.
  val fileSource = FileIO.fromFile(file, 1 * 1024 * 1024)
  // Each chunk is an akka.util.ByteString; decode it to build a readable string.
  val shaFlow: Source[String, Future[Long]] = fileSource.map { chunk =>
    s"the string obtained is ${chunk.utf8String}"
  }

  shaFlow.runForeach(println(_))
}

This works for files of any size. For more information on dispatcher configuration, see the Akka Streams documentation.
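As a minimal sketch of what that configuration might look like (the dispatcher name "my-blocking-dispatcher" and its settings below are hypothetical, not part of the original answer), the file source can be pointed at a dedicated dispatcher via ActorAttributes:

import java.io.File

import akka.actor.ActorSystem
import akka.stream.{ActorAttributes, ActorMaterializer}
import akka.stream.scaladsl.FileIO

object DispatcherExample extends App {
  implicit val system = ActorSystem("Sys")
  implicit val materializer = ActorMaterializer()

  // "my-blocking-dispatcher" is a hypothetical dispatcher that would be
  // defined in application.conf, for example:
  //
  //   my-blocking-dispatcher {
  //     type = Dispatcher
  //     executor = "thread-pool-executor"
  //     thread-pool-executor { fixed-pool-size = 4 }
  //   }
  val fileSource = FileIO
    .fromFile(new File("somefile.csv"), 1 * 1024 * 1024)
    .withAttributes(ActorAttributes.dispatcher("my-blocking-dispatcher"))

  // Print the size of each chunk as it is read from disk.
  fileSource.runForeach(chunk => println(s"read ${chunk.size} bytes"))
}

Without the attribute, file IO stages run on the stream's default blocking-IO dispatcher, so this is only needed when you want to isolate the file reads on their own thread pool.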
