Spark - UbuntuVM - insufficient memory for the Java Runtime Environment

Submitted by 倾然丶 夕夏残阳落幕 on 2020-01-05 15:19:00

Question


I'm trying to install Spark 1.5.1 on an Ubuntu 14.04 VM. After un-tarring the archive, I changed into the extracted directory and ran "./bin/pyspark", which should start the pyspark shell. Instead I got the following error:

OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000c5550000, 715849728, 0) failed; error='Cannot allocate memory' (errno=12)

There is insufficient memory for the Java Runtime Environment to continue.

Native memory allocation (malloc) failed to allocate 715849728 bytes for committing reserved memory.

An error report file with more information is saved as: /home/datascience/spark-1.5.1-bin-hadoop2.6/hs_err_pid2750.log

Could anyone please point me in the right direction to sort out this problem?


Answer 1:


Set spark.driver.memory in the conf/spark-defaults.conf file to a value your machine can actually provide; the error above comes from the driver JVM trying to reserve more heap than the VM has free. For example,

usr1@host:~/spark-1.6.1$ cp conf/spark-defaults.conf.template conf/spark-defaults.conf
nano conf/spark-defaults.conf
spark.driver.memory              512m

For more information, refer to the official documentation: http://spark.apache.org/docs/latest/configuration.html
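If you'd rather not edit the config file, the same limit can be passed on the command line each time the shell starts. A minimal sketch, assuming the same Spark 1.5.x directory layout as the question; 512m is an illustrative value for a small VM, tune it to your machine:

```shell
# Cap the driver JVM heap at launch instead of editing spark-defaults.conf.
# bin/pyspark forwards spark-submit options, so --driver-memory works here.
./bin/pyspark --driver-memory 512m
```

Command-line options override spark-defaults.conf, so this is also a convenient way to test a value before committing it to the config file.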




Answer 2:


Pretty much what it says: the JVM failed to commit the 715849728 bytes (roughly 700 MB) it asked for, so the VM doesn't have that much free memory. Either give the VM more RAM (2 GB or more is comfortable for local Spark), or lower the driver heap as in Answer 1.
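Before resizing the VM, it's worth confirming how much memory it actually has and whether a per-process limit is getting in the way. A minimal diagnostic sketch using standard Linux tools:

```shell
# Total, used, and free RAM in megabytes -- the "available" column is
# what the JVM can realistically commit.
free -m

# A per-process virtual-memory ulimit can also cause errno=12 even when
# RAM is free; "unlimited" means no clamp is in effect.
ulimit -v
```

If `free -m` shows less available memory than the allocation that failed, the fix is more RAM for the VM or a smaller driver heap; if `ulimit -v` shows a finite limit, raise it for the shell that launches Spark.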



Source: https://stackoverflow.com/questions/33245529/spark-ubuntuvm-insufficient-memory-for-the-java-runtime-environment
