Zeppelin with Spark interpreter ignores imports declared outside of class/function definition

Submitted by 谁都会走 on 2019-12-08 06:50:50

Question


I'm trying to use some Scala code in Zeppelin 0.8.0 with Spark interpreter:

%spark
import scala.beans.BeanProperty

class Node(@BeanProperty val parent: Option[Node]) {
}

But the import does not seem to be taken into account:

import scala.beans.BeanProperty
<console>:14: error: not found: type BeanProperty
                  @BeanProperty val parent: Option[Node]) {
                   ^

EDIT: I found out that the following code works:

class Node(@scala.beans.BeanProperty val parent: Option[Node]) {
}

This also works fine:

def loadCsv(CSVPATH: String): DataFrame = {
    import org.apache.spark.sql.types._
    //[...] some code
    val schema = StructType(
        firstRow.map(s => StructField(s, StringType))
    )
    //[...] some code again
}

So it seems everything works fine if the import is scoped inside braces, or if the class is referenced by its fully qualified path (path.to.package.Class) at the use site.
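The two observations above can be sketched together in one self-contained snippet (the class names NodeA and NodeB are illustrative, not from the original code):

```scala
// Workaround 1: fully qualify the annotation at the use site,
// so no top-level import is needed.
class NodeA(@scala.beans.BeanProperty val parent: Option[NodeA])

// Workaround 2: scope the import inside an enclosing object,
// so the import and the class are compiled together.
object Wrapper {
  import scala.beans.BeanProperty
  class NodeB(@BeanProperty val parent: Option[NodeB])
}

val a = new NodeA(None)
println(a.getParent) // None

val b = new Wrapper.NodeB(None)
println(b.getParent) // None
```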

QUESTION: How do I import outside of a class/function definition?


Answer 1:


Referring to a class by its fully qualified path (path.to.package.Class) works well in Zeppelin, and so does a plain import in ordinary expression position. You can check the latter by importing and using java.sql.Date:

import java.sql.Date
val date = Date.valueOf("2019-01-01")

The problem lies in the Zeppelin context: each top-level statement appears to be compiled in its own scope, so a top-level import is not visible where the annotation on the class is resolved. If you wrap the import and the class together in an object, you will see that it works fine:

object TestImport {
  import scala.beans.BeanProperty
  class Node(@BeanProperty val parent: Option[Node])
}

val testObj = new TestImport.Node(None)
testObj.getParent
// prints Option[Node] = None

I hope it helps!



Source: https://stackoverflow.com/questions/52168304/zeppelin-with-spark-interpreter-ignores-imports-declared-outside-of-class-functi
