How to generate a big data stream on the fly

There's a note in the docs for Enumerator.outputStream:

Not [sic!] that calls to write will not block, so if the iteratee that is being fed to is slow to consume the input, the OutputStream will not push back. This means it should not be used with large streams since there is a risk of running out of memory.

Whether this can happen depends on your situation. If you can and will generate gigabytes in seconds, you should probably try something different. I'm not exactly sure what, but I'd start with Enumerator.generateM(): it is pull-based, evaluating your block for the next chunk only after the iteratee has consumed the previous one, so a slow consumer back-pressures the producer.
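A minimal sketch of that idea (the counter and chunk contents here are made up for illustration):

import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.api.libs.iteratee.Enumerator
import scala.concurrent.Future

// Hypothetical: emit 100 chunks of 1 KB each, then finish
var chunksLeft = 100
val pulled: Enumerator[Array[Byte]] = Enumerator.generateM {
  Future {
    if (chunksLeft > 0) {
      chunksLeft -= 1
      Some(Array.fill(1024)('x'.toByte)) // the next chunk
    } else None // None terminates the stream
  }
}

For many cases though, your method is perfectly fine. Have a look at this example by Gaëtan Renaudeau for serving a zip file that's generated on the fly in the same way you're using it: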

import java.util.zip.{ ZipEntry, ZipOutputStream }

val r = new scala.util.Random() // source of the random numbers below

val enumerator = Enumerator.outputStream { os =>
  val zip = new ZipOutputStream(os)
  Range(0, 100).foreach { i =>
    zip.putNextEntry(new ZipEntry("test-zip/README-" + i + ".txt"))
    zip.write("Here are 100000 random numbers:\n".getBytes)
    // 100 writes of 1,000 numbers each, i.e. 100,000 numbers per entry
    Range(0, 100).foreach { _ =>
      zip.write(Range(0, 1000).map(_ => r.nextLong).mkString("\n").getBytes)
    }
    zip.closeEntry()
  }
  zip.close()
}
Ok.stream(enumerator >>> Enumerator.eof).withHeaders(
  "Content-Type"->"application/zip", 
  "Content-Disposition"->"attachment; filename=test.zip"
)

Please keep in mind that Ok.stream has been replaced by Ok.chunked in newer versions of Play, in case you want to upgrade.
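With Play 2.1 or later the same response would be written like this (a sketch under that assumption; everything else stays the same):

Ok.chunked(enumerator >>> Enumerator.eof).withHeaders(
  "Content-Type" -> "application/zip",
  "Content-Disposition" -> "attachment; filename=test.zip"
)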

As for the chunk size, you can always use Enumeratee.grouped to gather a bunch of values and send them as one chunk.

// Gather 100 doubles at a time and concatenate them into one chunk
val grouper = Enumeratee.grouped(
  Traversable.take[Array[Double]](100) &>> Iteratee.consume()
)

Then you'd do something like

Ok.stream(enumerator &> grouper >>> Enumerator.eof)
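
The zip enumerator above produces Array[Byte] rather than Array[Double], so the same idea applied there would look like this (a sketch; the 8 KB batch size is an arbitrary choice of mine):

// Gather 8 KB worth of bytes before emitting them as one chunk
val byteGrouper = Enumeratee.grouped(
  Traversable.take[Array[Byte]](8 * 1024) &>> Iteratee.consume[Array[Byte]]()
)

Ok.stream(enumerator &> byteGrouper >>> Enumerator.eof)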