SparkContext Not Being Initialized after sbt run

Submitted by 試著忘記壹切 on 2019-12-24 13:21:04

Question


I have my build.sbt file below:

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)

I also have this in src/main/scala/example.scala:

import org.apache.spark._
import org.apache.spark.SparkContext._

object WordCount {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("wordCount").setMaster("local")
      val sc = new SparkContext(conf)
      val input = sc.textFile("food.txt")
      val words = input.flatMap(line => line.split(" "))
      val counts = words.map(word => (word, 1)).reduceByKey { case (x, y) => x + y }
      counts.saveAsTextFile("output.txt") // note: this creates a directory named output.txt
      sc.stop()
    }
}
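As a side note, the counting logic itself can be sanity-checked on a plain Scala collection, independent of the Spark error below. This sketch (names like `WordCountLocal` are illustrative, not from the original post) mirrors the RDD pipeline using standard collections:

```scala
object WordCountLocal {
  def main(args: Array[String]): Unit = {
    val lines = Seq("spark makes word counts", "word counts with spark")
    // flatMap/split mirrors the RDD pipeline; groupBy + size stands in for
    // map(word => (word, 1)).reduceByKey(_ + _) on a local collection.
    val counts = lines
      .flatMap(_.split(" "))
      .groupBy(identity)
      .map { case (word, occurrences) => (word, occurrences.size) }
    counts.toSeq.sortBy(_._1).foreach(println)
  }
}
```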

For some reason, when I run sbt run from the project root (not src/main/scala), I get this error:

[info] Running WordCount 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/06/21 22:05:08 INFO SparkContext: Running Spark version 1.6.1
16/06/21 22:05:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/21 22:05:09 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: LM-SFA-11002982: LM-SFA-11002982: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1475)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:788)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:781)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:781)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:838)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:838)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:838)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
    at WordCount$.main(exam.scala:8)
    at WordCount.main(exam.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at sbt.Run.invokeMain(Run.scala:67)
    at sbt.Run.run0(Run.scala:61)
    at sbt.Run.sbt$Run$$execute$1(Run.scala:51)
    at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:55)
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
    at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
    at sbt.Logger$$anon$4.apply(Logger.scala:84)
    at sbt.TrapExit$App.run(TrapExit.scala:248)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: LM-SFA-11002982: nodename nor servname provided, or not known
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1295)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1471)
    ... 23 more
16/06/21 22:05:09 INFO SparkContext: Successfully stopped SparkContext

Can someone please explain to me the problem stated in this error? Is this because my dependencies were not installed correctly or is it because of another reason?


Answer 1:


It looks like your machine's hostname (LM-SFA-11002982 in the stack trace) cannot be resolved to an IP address, so SparkContext fails while determining the driver's local address. This has nothing to do with your dependencies.

As a [ pretty lame ] workaround, you can map the hostname to the loopback address yourself:

echo "127.0.0.1 LM-SFA-11002982" | sudo tee -a /etc/hosts
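To confirm that this lookup is really what fails, you can reproduce the call Spark makes: the stack trace shows `Utils.findLocalInetAddress` going through `InetAddress.getLocalHost`. If this minimal sketch (the `HostCheck` object name is just illustrative) throws `UnknownHostException`, the hosts-file entry above should fix it; setting Spark's documented `SPARK_LOCAL_IP` environment variable to `127.0.0.1` before `sbt run` is another option, since it tells Spark which address to bind to without resolving the hostname:

```scala
import java.net.InetAddress

object HostCheck {
  def main(args: Array[String]): Unit = {
    // Same call as in the stack trace: fails with UnknownHostException
    // when the machine's hostname has no entry in /etc/hosts or DNS.
    val addr = InetAddress.getLocalHost
    println(s"${addr.getHostName} resolves to ${addr.getHostAddress}")
  }
}
```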


Source: https://stackoverflow.com/questions/37959082/sparkcontext-not-being-initialized-after-sbt-run
