Differences between null and NaN in Spark? How to deal with them?

失恋的感觉 · 2020-12-14 07:32

In my DataFrame, there are columns containing null values and NaN values respectively, for example:

df = spark.createDataFrame([(1, float('nan')), (None, 1.0)], ("a", "b"))
df.show()

+----+---+
|   a|  b|
+----+---+
|   1|NaN|
|null|1.0|
+----+---+


        
3 Answers
  •  失恋的感觉 · 2020-12-14 07:53

    You can deal with it using this code:

    import pandas as pd
    df = df.where(pd.notnull(df), None)  # replaces every NaN with None

    The code will convert any NaN value into null. Note that pandas.notnull works on a pandas DataFrame, so if df is a Spark DataFrame you need to convert it first with df.toPandas().
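
    For context, null and NaN are distinct in Spark: null marks a missing value of any data type, while NaN ("not a number") is itself a valid floating-point value. A minimal self-contained PySpark sketch (reusing the a and b columns from the question's example) showing that the two need different predicates:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(1, float('nan')), (None, 1.0)], ("a", "b"))

    # isNull() matches SQL nulls; isnan() matches the float NaN value
    df.filter(F.col("a").isNull()).show()   # keeps the (null, 1.0) row
    df.filter(F.isnan(F.col("b"))).show()   # keeps the (1, NaN) row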

    Below is the reference link:

    Link
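
    If you would rather stay in Spark instead of converting to pandas, here is a minimal sketch (assuming a recent PySpark where DataFrame.replace accepts None as the replacement value) that turns NaN into null and then drops all rows with missing values:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(1, float('nan')), (None, 1.0)], ("a", "b"))

    # replace() turns the float NaN into a proper SQL null;
    # after that, dropping nulls removes both original kinds of missing row
    df = df.replace(float('nan'), None)
    df.na.drop().show()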
