Spark Scala list folders in directory

北恋 2020-12-05 09:41

I want to list all folders within an HDFS directory using Scala/Spark. In Hadoop I can do this with the command: hadoop fs -ls hdfs://sandbox.hortonworks.com/demo/

9 Answers
  •  情书的邮戳
    2020-12-05 10:21

    import java.net.URI
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    object HDFSProgram extends App {
      // Connect to the HDFS NameNode (replace HOSTNAME:PORT with your cluster's address)
      val uri = new URI("hdfs://HOSTNAME:PORT")
      val fs = FileSystem.get(uri, new Configuration())
      // List every file and folder directly under /user/hive/ and print its path
      val filePath = new Path("/user/hive/")
      val status = fs.listStatus(filePath)
      status.map(sts => sts.getPath).foreach(println)
    }
    

    This is sample code to list the HDFS files and folders present under /user/hive/.
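
    If only the folders are needed (as the question asks), the same listing can be filtered on FileStatus.isDirectory. A minimal sketch, reusing the fs handle from the snippet above; the Hadoop Configuration could also be taken from a running SparkSession via spark.sparkContext.hadoopConfiguration:

    // Keep only directories under /user/hive/ (isDirectory filters out plain files)
    val folders = fs.listStatus(new Path("/user/hive/")).filter(_.isDirectory)
    folders.foreach(sts => println(sts.getPath))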
