IntelliJ IDEA 14: cannot resolve symbol spark

Submitted by 岁酱吖の on 2020-01-01 03:18:04

Question


I added a Spark dependency that worked in my first project. But when I try to create a new project with Spark, sbt does not import the external jars of org.apache.spark, so IntelliJ IDEA reports that it "cannot resolve symbol". I already tried creating a new project from scratch and using auto-import, but neither works. When I try to compile I get the message "object apache is not a member of package org". My build.sbt looks like this:

name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"

I have the impression that something might be wrong with my sbt settings, although they already worked once, and apart from the external libraries everything is the same... I also tried importing the pom.xml file of my Spark dependency, but that doesn't work either. Thanks in advance!


Answer 1:


This worked for me:

name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
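
A variant of the same build, as a sketch, uses %% so that sbt appends the binary Scala suffix itself and all three Spark modules stay aligned with scalaVersion (the version numbers simply mirror the ones above):

name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"

// %% appends the project's binary Scala version (_2.11 here),
// so every Spark module resolves against the same Scala build
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.2.0",
  "org.apache.spark" %% "spark-sql"   % "2.2.0",
  "org.apache.spark" %% "spark-mllib" % "2.2.0"
)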



Answer 2:


I use

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

in my build.sbt and it works for me.




Answer 3:


I had a similar problem. It seems the reason was that my build.sbt file specified the wrong version of Scala.

If you run spark-shell, its startup banner prints the Scala version that Spark was built with, e.g.

Using Scala version 2.11.8

Then I edited the scalaVersion line in my build.sbt to point to that version, and it worked.
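
For the banner above, the corresponding edit is just one line (a minimal sketch; everything else in build.sbt stays as it was):

// match the Scala version reported by spark-shell
scalaVersion := "2.11.8"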




Answer 4:


Currently spark-cassandra-connector is compatible with Scala 2.10 and 2.11.

It worked for me once I updated the Scala version of my project as below:

ThisBuild / scalaVersion := "2.11.12"

and I updated my dependency like:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0",

If you use "%%", sbt will add your project’s binary Scala version to the artifact name.
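
For example, with scalaVersion := "2.11.12" the following two lines resolve to the same artifact (a sketch to show the expansion):

// %% lets sbt append the binary Scala suffix for you...
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0"
// ...which is equivalent to writing the suffix explicitly with %
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.4.0"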

From the sbt shell, run:

sbt> reload
sbt> compile



Answer 5:


Your library dependency conflicts with the Scala version you're using; you need to use 2.11 for it to work. The correct dependency would be:

scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"

Note that you need to change spark-parent to spark-core.




Answer 6:


name := "SparkLearning"

version := "0.1"

scalaVersion := "2.12.3"

// additional libraries (the artifact suffix must match scalaVersion above)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.12" % "2.4.0"



Source: https://stackoverflow.com/questions/32265343/intellij-idea-14-cannot-resolve-symbol-spark
