
Scala : Read Array value in Elasticsearch with Spark

纵然是瞬间 submitted on 2020-08-26 09:51:05
Question: I am trying to read data from Elasticsearch, but the document I want to read contains a nested array (that I want to read). I included the option "es.read.field.as.array.include" in the following way:

    val dataframe = reader
      .option("es.read.field.as.array.include", "arrayField")
      .option("es.query", "someQuery")
      .load("Index/Document")

but got the error:

    java.lang.ClassCastException: scala.collection.convert.Wrappers$JListWrapper cannot be cast to java.lang.Float

How should I map my array to …
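A minimal sketch of how the read might be set up so the connector maps the field to a Spark ArrayType instead of handing back a raw JListWrapper; the option value must list the array field, and for nested fields it needs the full dotted path (e.g. "parent.arrayField"). The index, query and field names come from the question; the SparkSession setup and the format name are assumptions.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, explode}

    val spark = SparkSession.builder().appName("es-array-read").getOrCreate()

    // Declare the array field explicitly so elasticsearch-hadoop infers ArrayType
    // instead of a plain scalar type for a field that actually holds a list.
    val dataframe = spark.read
      .format("org.elasticsearch.spark.sql")
      .option("es.read.field.as.array.include", "arrayField") // full dotted path if nested
      .option("es.query", "someQuery")
      .load("Index/Document")

    // With the field read as ArrayType(FloatType), it can be exploded or
    // collected as a Seq[Float] rather than cast element by element.
    dataframe.select(explode(col("arrayField")).as("value")).show()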

scala-js “@JSGlobalScope” error when migrating to scala-js 1.1.1

核能气质少年 submitted on 2020-08-26 07:18:08
Question: I had the following snippet of code in scala-js (0.6.33):

    object Main2 extends App {
      val js = for {
        jsTest <- JSTest.js1.toOption
      } yield jsTest
      println(JSTest.js1)
    }

    import scala.scalajs.js
    import scala.scalajs.js.annotation.JSGlobalScope

    @js.native
    @JSGlobalScope
    object JSTest extends js.Object {
      def js1: js.UndefOr[JS2] = js.native
    }

    @js.native
    trait JS1 extends js.Object {
      def js1: js.UndefOr[JS2] = js.native
    }

    @js.native
    trait JS2 extends js.Object {
      def js2: js.UndefOr[Int] = js.native
    }
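A sketch of the adjustment that is typically needed when moving to Scala.js 1.x: reading a member of a @JSGlobalScope object whose global variable is not actually declared on the page throws a ReferenceError at runtime instead of yielding undefined, so the access is guarded with js.typeOf, which compiles to a bare typeof check that tolerates undeclared globals. The names JSTest, JS2 and js1 come from the question; the guard itself is the assumed fix.

    import scala.scalajs.js
    import scala.scalajs.js.annotation.JSGlobalScope

    @js.native
    @JSGlobalScope
    object JSTest extends js.Object {
      def js1: js.UndefOr[JS2] = js.native
    }

    @js.native
    trait JS2 extends js.Object {
      def js2: js.UndefOr[Int] = js.native
    }

    object Main2 extends App {
      // In Scala.js 1.x, guard global-scope reads with js.typeOf before touching
      // the value; a direct read of an undeclared global throws a ReferenceError.
      val js1Opt: Option[JS2] =
        if (js.typeOf(JSTest.js1) != "undefined") JSTest.js1.toOption
        else None

      println(js1Opt)
    }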

mock case class in scala : Mockito

天涯浪子 submitted on 2020-08-26 03:40:22
Question: In my Play application I intend to mock a case class. I am able to do so, but it creates an object with all member variables null. Is there a way to create mock objects of a case class such that the object can have some members initialized?

    case class User(name: String, address: String)

    val mockUser = mock[User]
    mockUser.name    // null
    mockUser.address // null

How do I create a mockUser such that I can assign some values to name and address?

Edit: I need the ability to mock the object because I want to …
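A sketch of one way to get non-null members back from the mock, written against the plain Mockito API (the question's mock[User] sugar comes from a MockitoSugar trait) by stubbing the accessors after creating the mock. Whether this works out of the box depends on the Mockito version and mock-maker configuration; for a simple data holder, constructing a real instance such as User("bob", "baker street") is usually the easier route.

    import org.mockito.Mockito.{mock, when}

    case class User(name: String, address: String)

    // Create the mock, then stub the case-class accessors individually.
    val mockUser: User = mock(classOf[User])
    when(mockUser.name).thenReturn("bob")
    when(mockUser.address).thenReturn("baker street")

    assert(mockUser.name == "bob")
    assert(mockUser.address == "baker street")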

Joining Two Datasets with Predicate Pushdown

China☆狼群 submitted on 2020-08-25 04:04:09
Question: I have a Dataset that I created from an RDD and try to join it with another Dataset which is created from my Phoenix table:

    val dfToJoin = sparkSession.createDataset(rddToJoin)
    val tableDf = sparkSession
      .read
      .option("table", "table")
      .option("zkURL", "localhost")
      .format("org.apache.phoenix.spark")
      .load()
    val joinedDf = dfToJoin.join(tableDf, "columnToJoinOn")

When I execute it, it seems that the whole database table is loaded to do the join. Is there a way to do such a join so that the …
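A hedged sketch of one way to avoid scanning the whole Phoenix table, assuming dfToJoin is small: collect its distinct join keys, push them into a filter on the Phoenix DataFrame (a filter the connector can translate into an IN predicate on the table), and broadcast the small side for the join itself. The column and variable names are taken from the question; the collect-and-filter strategy is the assumption.

    import org.apache.spark.sql.functions.{broadcast, col}

    // Gather the join keys from the small side on the driver.
    val keys = dfToJoin
      .select(col("columnToJoinOn"))
      .distinct()
      .collect()
      .map(_.get(0))

    // Filter the Phoenix side by those keys before joining, so only matching
    // rows are read, and broadcast the small Dataset to avoid a shuffle.
    val joinedDf = tableDf
      .filter(col("columnToJoinOn").isin(keys: _*))
      .join(broadcast(dfToJoin), "columnToJoinOn")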
