How to read a file using Spark Streaming and write it to a new file using Scala?
Question: I'm trying to read a file using a Scala Spark Streaming program. The file is stored in a directory on my local machine, and I'm trying to write it out as a new file on my local machine as well. But whenever I write my stream and store it as Parquet, I end up with blank folders. This is my code:

```scala
Logger.getLogger("org").setLevel(Level.ERROR)

val spark = SparkSession
  .builder()
  .master("local[*]")
  .appName("StreamAFile")
  .config("spark.sql.warehouse.dir", "file:///C:/temp")
  .getOrCreate()

import spark
```
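Since the snippet above cuts off before the streaming read and write calls, here is a minimal sketch of what a complete pipeline of this shape typically looks like. The input/output paths, checkpoint directory, and schema below are assumptions, not taken from the question. Two details that commonly cause empty output folders with file sinks: Structured Streaming file sources require an explicit schema, and the Parquet file sink requires a `checkpointLocation`.

```scala
// Minimal sketch, assuming a CSV input directory and a hypothetical schema.
import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{IntegerType, StringType, StructType}

object StreamAFile {
  def main(args: Array[String]): Unit = {
    Logger.getLogger("org").setLevel(Level.ERROR)

    val spark = SparkSession
      .builder()
      .master("local[*]")
      .appName("StreamAFile")
      .config("spark.sql.warehouse.dir", "file:///C:/temp")
      .getOrCreate()

    // File sources need a schema up front; adjust the fields to your data.
    val schema = new StructType()
      .add("id", IntegerType)
      .add("value", StringType)

    // Assumed input directory: Spark picks up new files dropped here.
    val input = spark.readStream
      .schema(schema)
      .csv("file:///C:/temp/input")

    // File sinks require a checkpoint location; without it the query
    // fails to start and the output directory stays empty.
    val query = input.writeStream
      .format("parquet")
      .option("path", "file:///C:/temp/output")
      .option("checkpointLocation", "file:///C:/temp/checkpoint")
      .start()

    query.awaitTermination()
  }
}
```

Note that a streaming query only produces output when new files arrive in the watched directory after the query starts, which is another frequent reason the output folder appears blank at first.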