Application report for application_ (state: ACCEPTED) never ends for Spark Submit (with Spark 1.2.0 on YARN)

Backend | Unresolved | 13 answers | 1551 views

说谎 2020-12-04 23:27

I am running the Kinesis plus Spark application from https://spark.apache.org/docs/1.2.0/streaming-kinesis-integration.html

I am running it as below.

Command on an EC2 instance:

13 Answers
  •  北海茫月
    2020-12-05 00:17

    There are three ways we can try to fix this issue.

    1. Check for Spark processes on your machine and kill them.

    Do

    ps aux | grep spark
    

    Take the process IDs of all the Spark processes and kill them, like

    sudo kill -9 4567 7865
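    The two commands above can be combined into one pipeline. This is a sketch; the `[s]park` bracket trick (so grep does not match its own command line) is my addition, not part of the original answer:

```shell
# Collect the PIDs of running Spark processes; [s]park keeps grep from
# matching itself in the process list.
pids=$(ps aux | grep '[s]park' | awk '{print $2}')
echo "Spark PIDs: $pids"
# After confirming the list looks right, kill them (uncomment to run):
# echo "$pids" | xargs -r sudo kill -9
```

    Printing the PIDs first, instead of piping straight into `kill -9`, gives you a chance to spot processes you did not intend to kill.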
    
    2. Check the number of Spark applications running on your cluster.

    To check this, do

    yarn application -list
    

    You will get output similar to this:

    Total number of applications (application-types: [] and states: [SUBMITTED, ACCEPTED, RUNNING]):1
                    Application-Id      Application-Name        Application-Type          User       Queue               State         Final-State         Progress                        Tracking-URL
    application_1496703976885_00567       ta da                SPARK        cloudera       default             RUNNING           UNDEFINED              20%             http://10.0.52.156:9090
    

    Check the application IDs: if more than one or two are listed, kill the extras. Your cluster may not be able to run more than two Spark applications at the same time; I am not 100% sure of the exact limit, but if you run more than two, the cluster will start complaining. Kill them like this:

    yarn application -kill application_1496703976885_00567
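    If several stale applications are listed, the cleanup can be scripted. A sketch: here the sample listing above is fed in via a heredoc so the extraction step is self-contained; on a live cluster you would pipe `yarn application -list` directly instead, as shown in the commented lines:

```shell
# Pull application IDs out of `yarn application -list`-style output:
# the first field of each data row is the application ID.
awk '$1 ~ /^application_/ {print $1}' <<'EOF'
Total number of applications (application-types: [] and states: [SUBMITTED, ACCEPTED, RUNNING]):1
                Application-Id      Application-Name        Application-Type
application_1496703976885_00567       ta da                SPARK        cloudera       default             RUNNING           UNDEFINED              20%             http://10.0.52.156:9090
EOF
# Live version, feeding every ID to the kill command:
# yarn application -list | awk '$1 ~ /^application_/ {print $1}' \
#   | xargs -r -n1 yarn application -kill
```

    Filtering on the first field (`$1`) rather than grepping the whole line avoids accidentally matching an application ID that appears in a tracking URL or application name.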
    
    3. Check your Spark config parameters. For example, if you have requested more executor memory, driver memory, or executors than your cluster can provide, YARN cannot allocate the containers and the application may stay stuck. So reduce any of them and rerun your Spark application; that might resolve it.
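    As a sketch of point 3, a resubmission with deliberately modest resource requests might look like the following; the flag values and `your-app.jar` are illustrative, not taken from the question:

```shell
# Illustrative spark-submit (Spark 1.2.0 on YARN) with reduced resource
# requests so YARN can actually allocate the containers; tune the numbers
# to what your cluster has free.
spark-submit \
  --master yarn-cluster \
  --num-executors 2 \
  --executor-memory 1g \
  --driver-memory 1g \
  your-app.jar
```

    If the application then moves from ACCEPTED to RUNNING, the original request was simply larger than the free capacity in your YARN queue.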
