java.lang.NoSuchMethodError: scala.Predef$.refArrayOps

刺人心 2020-12-01 13:59

I have the following class:

import scala.util.{Success, Failure, Try}

class MyClass {

  def openFile(fileName: String): Try[String] = {
    // (truncated in the original; presumably it wraps some exception)
    Failure(new Exception(s"Cannot open file: $fileName"))
  }
}
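For context: this error usually means bytecode compiled against one Scala minor version is running against a different scala-library (the return type of Predef.refArrayOps changed across Scala 2.11, 2.12, and 2.13, which changes the method descriptor in the bytecode). A minimal sketch of the kind of code that compiles into such a call:

// Treating an Array of references as a Scala collection goes through the
// implicit scala.Predef.refArrayOps conversion:
val names: Array[String] = Array("a", "b", "c")
val joined = names.mkString(", ") // bytecode calls scala.Predef$.refArrayOps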
10 answers
  • 2020-12-01 14:04

    I used IntelliJ and simply imported the project again: close the open project and re-import it as a Maven or SBT project. Note: I selected "Import Maven projects automatically". The error disappeared.

  • 2020-12-01 14:04

    This was happening to me in Databricks. The problem was the same Spark/Scala version incompatibility noted in the other answers. For Databricks, I had to change the cluster's Databricks Runtime Version: the default was Scala 2.11/Spark 2.4.5; bump this up to at least Scala 2.12/Spark 3.0.0.

    Click Clusters > Cluster_Name > Edit > Databricks Runtime Version
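
    If you also build the job locally, keep your build.sbt in line with the new runtime. A minimal sketch, assuming Scala 2.12 and Spark 3.0.0 (match whatever runtime version you actually select):

    // build.sbt -- keep the local build in step with the cluster runtime
    scalaVersion := "2.12.10"

    libraryDependencies ++= Seq(
      // "provided": the Databricks cluster supplies Spark at runtime
      "org.apache.spark" %% "spark-core" % "3.0.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "3.0.0" % "provided"
    )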

  • 2020-12-01 14:08

    When you use Spark, Hadoop, Scala, and Java together, incompatibilities can arise, so choose a version of each that is compatible with the others. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12; they are compatible with each other.
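
    Pinned in build.sbt, that combination looks roughly like this (a sketch; 2.7.7 is an assumed patch release of Hadoop 2.7, and the hadoop-client line is only needed if your code uses Hadoop APIs directly):

    scalaVersion := "2.11.12"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.4.1",
      // hadoop-client is a plain Java artifact, hence % rather than %%
      "org.apache.hadoop" % "hadoop-client" % "2.7.7"
    )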

  • 2020-12-01 14:08

    In my case, the Spark version was the incompatibility; changing to Spark 2.4.0 worked for me.
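
    Before switching versions, it can help to confirm what is actually on the runtime classpath; a quick sketch:

    // Print the Scala and Spark versions seen at runtime
    println(scala.util.Properties.versionString) // e.g. "version 2.11.12"
    println(org.apache.spark.SPARK_VERSION)      // e.g. "2.4.0"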

  • 2020-12-01 14:08

    Try adding the following line to your build.sbt:

    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
    

    Your build.sbt should then look like this:

    libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
    
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
    

    With this, the error was resolved for me.
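
    Note that the %% in these lines is what keeps the dependency binary-compatible: sbt appends the project's Scala binary version to the artifact name, so the resolved jar always matches scalaVersion. For example, with scalaVersion := "2.12.8", the two lines below resolve to the same artifact:

    // %% appends the Scala binary version automatically:
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
    // equivalent spelled-out form:
    libraryDependencies += "org.scalatest" % "scalatest_2.12" % "3.0.1" % "test"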

  • 2020-12-01 14:09

    I had an SDK under Global Libraries with a different Scala version (in IntelliJ IDEA).
    File -> Project Structure -> Global Libraries -> remove the SDK -> rebuild. That fixed the exception for me.
