Are there any dependencies between Spark and Hadoop?
If not, are there any features I'll miss when I run Spark without Hadoop?
Yes, Spark can run without Hadoop. All core Spark features will continue to work, but you'll miss conveniences such as easily distributing all your files (code as well as data) to every node in the cluster via HDFS.
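As a sketch of what this looks like in practice: Spark can run in local mode or against its own standalone cluster manager with no Hadoop/HDFS installed, and `spark-submit --files` can ship individual files to executors in place of HDFS. The paths and master URL below are placeholders for illustration.

```shell
# Run Spark locally with no Hadoop at all (local mode, 4 cores):
spark-submit --master "local[4]" my_app.py

# Run against Spark's own standalone cluster manager (no YARN/HDFS);
# --files copies config.json to each executor's working directory,
# a per-file substitute for "put it on HDFS and read it everywhere":
spark-submit \
  --master spark://master-host:7077 \
  --files config.json \
  my_app.py

# Read input from the local filesystem instead of HDFS
# (inside the app: spark.read.text("file:///data/input.txt"))
```

The trade-off: `--files` works for a handful of small files, but without HDFS you need some other shared or replicated storage (NFS, S3 via `s3a://`, etc.) for large datasets that every node must read.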