New to the Spark world and trying a Dataset example written in Scala that I found online.
On running it through SBT, I keep getting the following error.
This line is what is causing the problem:
org.apache.spark.sql.catalyst.encoders.OuterScopes.addOuterScope(this)
This means that you are adding a new outer scope to this context, which can be used when instantiating an inner class
during deserialization.
Inner classes are created when a case class is defined in the Spark REPL, and registering the outer scope that the class was defined in allows new instances to be created on the Spark executors.
In normal use (your case), you shouldn't need to call this function.
EDIT: You'll also need to move your case classes outside of the DatasetExample
object.
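To illustrate the fix, here is a minimal sketch of how the code could be laid out: the case class is declared at the top level rather than nested inside the DatasetExample object, and there is no call to OuterScopes.addOuterScope. The Person class, the SparkSession setup, and the sample data are illustrative assumptions, not taken from the original post.

import org.apache.spark.sql.SparkSession

// Case class at the top level (not nested inside the object that runs the job),
// so Spark's encoders can instantiate it without an outer-scope reference.
case class Person(name: String, age: Int)

object DatasetExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DatasetExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._ // brings the encoders and conversions into scope

    // No call to OuterScopes.addOuterScope is needed here.
    val people = Seq(Person("Ann", 30), Person("Bob", 25)).toDS()
    people.show()

    spark.stop()
  }
}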
Note:
import sqlContext.implicits._
is a Scala-specific import that brings in the implicit methods for converting common Scala RDD objects into DataFrames.
More on that here.
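As a rough sketch of what that import enables, assuming an older Spark 1.x-style SQLContext as in the question (Record and ImplicitsExample are hypothetical names for illustration):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Case class at the top level, following the advice above.
case class Record(id: Int, label: String)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ImplicitsExample").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._ // brings rdd.toDF() / seq.toDS() into scope

    val rdd = sc.parallelize(Seq(Record(1, "a"), Record(2, "b")))
    val df = rdd.toDF() // implicit conversion from RDD[Record] to DataFrame
    df.show()

    sc.stop()
  }
}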