How do I load a file from the local file system into Spark using sc.textFile? Do I need to change any environment variables? I also hit the same problem when I tried this on Windows.
This error happens when you run Spark on a cluster. When you submit a job to a Spark cluster, the cluster manager (YARN, Mesos, or any other) sends it to a worker node. When the worker node tries to resolve the path of the file you want to load, it fails because that file does not exist on the worker's local file system. So try running spark-shell in local mode and load the file again:
\bin\spark-shell --master local
sc.textFile("file:///C:/Users/swaapnika/Desktop/to do list")
Let me know if this helps.
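
If you really do need to run on a cluster, one option is to ship the local file to every executor with sc.addFile and resolve each node's local copy via SparkFiles.get. This is a sketch assuming the Scala Spark shell; the file name todo.txt is hypothetical:

```scala
// Sketch (hypothetical file name): ship a driver-local file to all executors
// so sc.textFile can read it in cluster mode.
import org.apache.spark.SparkFiles

// Path exists only on the driver machine; addFile distributes it.
sc.addFile("file:///C:/Users/swaapnika/Desktop/todo.txt")

// SparkFiles.get resolves to the local copy on whichever node runs the task.
val rdd = sc.textFile("file://" + SparkFiles.get("todo.txt"))
rdd.take(5).foreach(println)
```

Alternatively, put the file on a shared store such as HDFS or S3 that all workers can reach, which is the usual approach for anything larger than a small lookup file.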