How to filter after split() in an RDD in Spark Scala?

1,John,NY
2,Bill,FL
3,Harry,TX

I have a text file with the above data.

val rdd = sc.textFile("/path").map(x => x.split(","))
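A minimal sketch of one way to filter on a field after the split, assuming a spark-shell session (so sc is already defined), that the columns are id,name,state as in the sample, and that the goal is to keep rows whose state equals "FL" (the path and the target value are placeholders):

// Split each line into Array(id, name, state)
val rdd = sc.textFile("/path").map(line => line.split(","))

// Keep only well-formed rows whose third field (state) is "FL"
val flRows = rdd.filter(fields => fields.length == 3 && fields(2) == "FL")

// Print the matching rows, e.g. 2,Bill,FL
flRows.collect().foreach(fields => println(fields.mkString(",")))

The filter runs on RDD[Array[String]], so the predicate indexes into each split array; checking fields.length first guards against malformed lines.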


        