Correct way of writing two floats into a regular txt file
Question: I am running a big job in cluster mode. However, I am only interested in two float numbers, which I want to read somehow when the job succeeds. Here is what I am trying:

```python
from pyspark.context import SparkContext

if __name__ == "__main__":
    sc = SparkContext(appName='foo')
    f = open('foo.txt', 'w')
    pi = 3.14
    not_pi = 2.79
    f.write(str(pi) + "\n")
    f.write(str(not_pi) + "\n")
    f.close()
    sc.stop()
```

However, 'foo.txt' doesn't appear to be written anywhere (probably it gets written in an executor, or on whichever cluster node the driver happened to run, rather than somewhere I can reach).
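One approach that is sometimes suggested for this situation is to write the values through Spark itself, so they end up on storage that is reachable after the job finishes (for example HDFS) instead of on the local disk of whatever node ran the driver. A minimal sketch, assuming an HDFS path such as `hdfs:///tmp/foo_result` is available (the path is hypothetical and must not already exist):

```python
from pyspark.context import SparkContext

if __name__ == "__main__":
    sc = SparkContext(appName='foo')
    pi = 3.14
    not_pi = 2.79
    # Turn the two driver-side floats into a single-partition RDD and let
    # Spark write it to shared storage, so the result can be read back after
    # the job ends regardless of which cluster node the driver ran on.
    # 'hdfs:///tmp/foo_result' is a hypothetical output directory.
    sc.parallelize([str(pi), str(not_pi)], 1).saveAsTextFile('hdfs:///tmp/foo_result')
    sc.stop()
```

The two lines can then be read back with something like `hdfs dfs -cat /tmp/foo_result/part-00000`, or whatever shared storage and path your cluster actually provides.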