Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a Dataset of case classes. When I try to do so, I'm greeted with a message telling me to import spark.implicits._.
The Spark documentation uses spark as the identifier for the SparkSession, and that is what causes the confusion. If you created yours with something like,
import org.apache.spark.sql.SparkSession

val ss = SparkSession
.builder()
.appName("test")
.master("local[2]")
.getOrCreate()
then the correct way to import the implicits would be,
import ss.implicits._
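For completeness, here is a minimal end-to-end sketch of what the question describes. The Person case class, the column names, and the sample data are made up for illustration, and running this assumes the spark-sql dependency is on your classpath:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical case class matching the DataFrame's schema.
case class Person(name: String, age: Long)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val ss = SparkSession
      .builder()
      .appName("test")
      .master("local[2]")
      .getOrCreate()

    // The implicits must be imported from YOUR SparkSession instance,
    // whatever you named it -- here `ss`, not the documentation's `spark`.
    import ss.implicits._

    // A DataFrame of Rows with illustrative data.
    val df = ss.createDataFrame(Seq(("Alice", 30L), ("Bob", 25L)))
      .toDF("name", "age")

    // .as[Person] converts the DataFrame of Rows into a Dataset[Person];
    // this compiles only because the Encoder is in scope via the import.
    val ds = df.as[Person]
    ds.show()

    ss.stop()
  }
}
```

The key point is that implicits is a member of the SparkSession instance, not a top-level package, so the import path starts with the name of your own variable.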
Let me know if this helps. Cheers.