How to filter one spark dataframe against another dataframe

Asked by 南旧 on 2020-12-15 08:10 · 1 answer · 1068 views

I'm trying to filter one dataframe against another:

scala> val df1 = sc.parallelize((1 to 100).map(a => (s"user $a", a * 0.123, a))).toDF("name", "score", "user_id")
1 Answer
  • 2020-12-15 08:46

    You want a (regular) inner join, not an outer join :)

    df1.join(df2, df1("user_id") === df2("valid_id"))
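
    Since the question's definition of df2 was cut off, here is a sketch assuming df2 holds the ids to keep in a column named `valid_id` (that column name comes from the join condition above; the sample values are made up):

    ```scala
    // Hypothetical df2: the set of ids we want to keep.
    val df2 = sc.parallelize(Seq(1, 5, 42)).toDF("valid_id")

    // Inner join: keeps only df1 rows whose user_id appears in df2,
    // but the result carries df2's valid_id column alongside df1's columns.
    val filtered = df1.join(df2, df1("user_id") === df2("valid_id"))

    // If you want only df1's columns back, a left_semi join does the same
    // filtering without adding any columns from df2 to the result.
    val semiFiltered = df1.join(df2, df1("user_id") === df2("valid_id"), "left_semi")
    ```

    The `left_semi` variant is usually the cleaner choice when the goal is purely to filter one dataframe by membership in another.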
    