I have the following class:
import scala.util.{Success, Failure, Try}

class MyClass {
  def openFile(fileName: String): Try[String] = {
    // returns a failed Try wrapping an exception
    Failure(new Exception("Could not open file: " + fileName))
  }
}
I use IntelliJ, and I simply re-imported the project: close the open project and import it again as a Maven or SBT project. Note: I selected the "Import Maven projects automatically" option. After that, the error disappeared.
This was happening to me in Databricks. The problem was the same as noted in the other answers: an incompatibility between the Spark and Scala versions. In Databricks, I had to change the cluster's Databricks Runtime Version. The default was Scala 2.11 / Spark 2.4.5; bump this up to at least Scala 2.12 / Spark 3.0.0.
Click Clusters > Cluster_Name > Edit > Databricks Runtime Version.
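If you are not sure which versions a cluster is actually running, you can check from a Scala notebook attached to it. This is just a quick sketch; it assumes the spark session that Databricks notebooks predefine:

println(spark.version)                        // Spark version, e.g. 3.0.0
println(scala.util.Properties.versionString)  // Scala version, e.g. "version 2.12.10"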
When you use Spark, Hadoop, Scala, and Java together, incompatibilities can arise, so you have to pick versions that are compatible with each other. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12, and they work together.
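If you build with sbt, one way to keep the Spark and Scala versions aligned is to pin them in build.sbt. This is only a sketch using the versions from this answer; the %% operator makes sbt append the matching Scala binary suffix (_2.11) to the Spark artifacts:

scalaVersion := "2.11.12"

// Spark artifacts must be published for the same Scala binary version as the project
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.1",
  "org.apache.spark" %% "spark-sql"  % "2.4.1"
)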
In my case, the Spark version caused the incompatibility. Changing to Spark 2.4.0 worked for me.
Try adding the following line to your build.sbt:
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
so that your build.sbt looks like this:
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
With this, the error was solved for me.
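To confirm the dependency is picked up, a minimal spec along these lines should compile and run with sbt test. This is just a sketch, assuming the MyClass from the question is on the classpath:

import org.scalatest.{FlatSpec, Matchers}

class MyClassSpec extends FlatSpec with Matchers {
  "openFile" should "return a Failure for a missing file" in {
    new MyClass().openFile("missing.txt").isFailure shouldBe true
  }
}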
I had an SDK in Global Libraries with a different Scala version (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> remove the SDK -> rebuild. That fixed the exception for me.