I have a Spark cluster with 10 nodes, and I'm getting this exception after using the Spark context for the first time:
14/11/20 11:15:13 ERROR UserGroupInfor
I had a similar problem and managed to get around it by using the cluster deploy mode when submitting the application to Spark.
(Even allowing all incoming traffic to both my master and my single slave did not let me use the client deploy mode. Before changing them, I had the default security group (AWS firewall) settings set up by the Spark EC2 scripts from Spark 1.2.0.)
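For reference, a minimal sketch of such a submission; the master URL, class name, and jar path below are placeholders rather than the exact values from my setup:

    spark-submit \
      --master spark://ec2-master-host:7077 \
      --deploy-mode cluster \
      --class com.example.MyApp \
      my-app.jar

The key difference is --deploy-mode cluster, which launches the driver inside the cluster instead of on the submitting machine, so the workers only need to reach the master rather than connect back to my local client through the firewall.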