I am trying to debug a Spark application on a cluster with a master and several worker nodes. I have successfully set up the master node and the worker nodes using Spark.
You could run the Spark application in local mode if you just need to debug the logic of your transformations. You can run it in your IDE and debug it like any other application:
import org.apache.spark.SparkConf

val conf = new SparkConf().setMaster("local").setAppName("myApp")
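For example, a minimal self-contained driver you could run and debug from the IDE might look like this (the object name and the sample transformation are hypothetical, and it assumes the Spark dependencies are on your IDE's classpath):

import org.apache.spark.{SparkConf, SparkContext}

object MyApp {
  def main(args: Array[String]): Unit = {
    // "local[*]" uses all cores on your machine; plain "local" uses a single thread.
    val conf = new SparkConf().setMaster("local[*]").setAppName("myApp")
    val sc = new SparkContext(conf)

    // Set a breakpoint inside the lambda to step through the transformation logic.
    val squares = sc.parallelize(1 to 10).map(x => x * x)
    println(squares.collect().mkString(", "))

    sc.stop()
  }
}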
Of course, this setup doesn't distribute the work. Distributing it is as simple as changing the master URL to point at your cluster.
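For example, with a standalone cluster it could look like this (the host name is a placeholder; 7077 is the standalone master's default port):

val conf = new SparkConf()
  .setMaster("spark://master-host:7077")  // hypothetical master host
  .setAppName("myApp")

Note that with a cluster master, the executors run in separate JVMs on the worker nodes, so a debugger attached to the driver in your IDE will only step through driver-side code, not the code running inside the executors.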