Spark doesn't print output to the console within the map function

青春惊慌失措 2021-01-20 21:25

I have a simple Spark application running in cluster mode:

val funcGSSNFilterHeader = (x: String) => {
    println(!x.contains("servedMSISDN"))  // true for lines that do not contain "servedMSISDN"
    !x.contains("servedMSISDN")
}
2 Answers
  • 2021-01-20 21:33

    Your code consists of a driver (main/master) and executors (which run on the worker nodes in cluster mode).

    Functions that run inside a "map" run on the executors.

    i.e. when you are in cluster mode, a print inside the map function goes to the console of the node running that executor (which you won't see).

    In order to debug the program, you can:

    1. Run the code in "local" mode; the prints in the "map" function will then be printed to the console of your master/main node, because the executors run on the same machine.

    2. Replace "print to console" with save to file / save to Elasticsearch / etc. (see the sketch below).
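
    For example, a rough sketch of option 2, assuming the filtered lines end up in an RDD (the name filteredLines and the output path are illustrative, not from the question):

    // Write the records somewhere inspectable instead of printing on the executors.
    // saveAsTextFile runs on the executors but produces files you can read afterwards.
    filteredLines.saveAsTextFile("hdfs:///tmp/gssn-filter-debug")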


    Note that in addition to the local vs. cluster mode issue, it seems like you have a typo in your code:

    ggsnArrays.foreachRDD(s => {println(x.toString()})
    

    Should be (also note that the lambda parameter is s, so x is not in scope there):

    ggsnArrays.foreachRDD(s => { println(s.toString) })
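
    Related to that line: the body passed to foreachRDD itself runs on the driver, so pulling a small sample back with take before printing makes the output visible on the driver console. A rough sketch (ggsnArrays is the DStream from the question; the sample size is arbitrary):

    ggsnArrays.foreachRDD { rdd =>
      // take(...) collects a few records to the driver, so println is visible there
      rdd.take(10).foreach(println)
    }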
    
  • 2021-01-20 21:50

    Two possibilities: your logs are on the worker nodes, so you must check the worker logs for these messages. As suggested before, you can run your application in local mode to check the logs on your own machine. By the way, it's better to use a logging framework such as SLF4J than plain println, but I assume this is just for learning :)
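
    A rough sketch of the SLF4J suggestion, assuming the filter function from the question is wrapped in a holder object (the object name is made up); the messages end up in each executor's log, which you can read through the Spark UI or yarn logs:

    import org.slf4j.LoggerFactory

    object GssnFilters extends Serializable {
      // @transient lazy: the logger is created on each executor instead of being serialized
      @transient lazy val log = LoggerFactory.getLogger(getClass)

      val funcGSSNFilterHeader = (x: String) => {
        val keep = !x.contains("servedMSISDN")
        log.debug(s"servedMSISDN header check: $keep")  // goes to the executor's log, not the driver console
        keep
      }
    }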

    Also, the snippet contains no ssc.start() and ssc.awaitTermination(). Did you run these commands? If not, foreachRDD will never be executed. If the example is otherwise complete, please add these lines at the end of the script and try again, but do check the worker node logs :)
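
    For reference, a minimal skeleton of where those calls belong (the app name and batch interval are made up; the ggsnArrays DStream and the foreachRDD call from the question would sit in the middle):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("GssnFilter")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // ... create the ggsnArrays DStream and the foreachRDD call here ...

    ssc.start()              // nothing is executed until start() is called
    ssc.awaitTermination()   // keep the driver alive while the streaming job runs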
