How to check the Spark version

Submitted by 我只是一个虾纸丫 on 2019-11-29 10:34:24

Question


As the title says, how do I find out which version of Spark has been installed on CentOS?

The current system has CDH 5.1.0 installed.


Answer 1:


If you use spark-shell, the version appears in the banner at startup.

Programmatically, SparkContext.version can be used.
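
For example, a minimal standalone sketch (the app name and the local master below are placeholders for illustration, not part of the original answer):

import org.apache.spark.{SparkConf, SparkContext}

object PrintSparkVersion {
  def main(args: Array[String]): Unit = {
    // A throwaway local context just to read the version; an existing SparkContext works the same way
    val sc = new SparkContext(new SparkConf().setAppName("print-spark-version").setMaster("local[*]"))
    println(s"Spark version: ${sc.version}")  // e.g. "1.3.0" or "2.2.0", depending on the install
    sc.stop()
  }
}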




Answer 2:


Open a Spark shell terminal and run sc.version
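
For example (the exact version string depends on your installation; 2.2.0 here is only illustrative):

scala> sc.version
res0: String = 2.2.0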




Answer 3:


You can use the spark-submit command: spark-submit --version




Answer 4:


In a Spark 2.x program or shell, use

spark.version

where the spark variable is a SparkSession object.

Using the console logs printed at the start of spark-shell:

[root@bdhost001 ~]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Without entering the code/shell:

spark-shell --version

[root@bdhost001 ~]$ spark-shell --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Type --help for more information.

spark-submit --version

[root@bdhost001 ~]$ spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Type --help for more information.



Answer 5:


If you are using Databricks and talking to a notebook, just run :

spark.version



Answer 6:


Use the command below to get the Spark version:

spark-submit --version



Answer 7:


Whichever shell command you use, spark-shell or pyspark, it lands on a Spark logo with the version printed beside it.

$ pyspark
Python 2.6.6 (r266:84292, May 22 2015, 08:34:51)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-15)] on linux2
...
Welcome to
      ...   version 1.3.0




Answer 8:


If you are using pyspark, the Spark version being used can be seen beside the Spark logo, as shown below:

manoj@hadoop-host:~$ pyspark
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Python version 2.7.6 (default, Jun 22 2015 17:58:13)
SparkContext available as sc, HiveContext available as sqlContext.
>>>

If you want to get the Spark version explicitly, you can use the version method of SparkContext, as shown below:

>>>
>>> sc.version
u'1.6.0'
>>>



Answer 9:


If you are in a Zeppelin notebook, you can run:

sc.version 

To know the Scala version as well, you can run:

util.Properties.versionString
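
A sketch of such a Zeppelin paragraph, assuming the %spark (Scala) interpreter; the version strings in the comments are only examples, not from the original answer:

%spark
sc.version                      // e.g. res0: String = 2.2.0
util.Properties.versionString   // e.g. res1: String = version 2.11.8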



Answer 10:


scala> spark.version
res9: String = 2.4.4


Source: https://stackoverflow.com/questions/29689960/how-to-check-the-spark-version
