Filter Pyspark dataframe column with None value

Asked by 小鲜肉 on 2020-11-29 18:10

I'm trying to filter a PySpark DataFrame that has None as a row value:

df.select('dt_mvmt').distinct().collect()

[Row(dt_mvmt=u'2016-03-27'), ...
10 Answers
  •  误落风尘
    2020-11-29 18:31

    PySpark provides various filtering options based on arithmetic, logical and other conditions. NULL values can hamper downstream processing, so removing them (or statistically imputing them) is often the first step.

    The following code can be used:

    # Dataset is df
    # Column name is dt_mvmt
    # Before filtering, check the row count of the dataset
    df.count()  # Some number
    
    # Keep only the rows where dt_mvmt is not NULL
    df = df.filter(df.dt_mvmt.isNotNull())
    
    # Check the count again: it should be reduced if NULL values were present
    # (important when dealing with large datasets)
    df.count()
    
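    A note on why `isNotNull()` is needed at all: a plain equality test such as `df.dt_mvmt == None` (or `WHERE dt_mvmt = NULL` in SQL) never matches anything, because SQL uses three-valued logic and any comparison with NULL evaluates to NULL rather than true. This is not specific to Spark; as a minimal sketch, the same behaviour can be reproduced with Python's built-in `sqlite3` module (the `events` table and its data here are made up for illustration):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (dt_mvmt TEXT)")
    conn.executemany(
        "INSERT INTO events VALUES (?)",
        [("2016-03-27",), ("2016-03-28",), (None,)],
    )

    # '=' against NULL evaluates to NULL (unknown), so no rows match
    print(conn.execute(
        "SELECT COUNT(*) FROM events WHERE dt_mvmt = NULL").fetchone()[0])   # 0

    # IS NOT NULL is the correct test, analogous to isNotNull() in PySpark
    print(conn.execute(
        "SELECT COUNT(*) FROM events WHERE dt_mvmt IS NOT NULL").fetchone()[0])  # 2
    ```

    The same distinction holds in PySpark: use `df.filter(df.dt_mvmt.isNotNull())` (or the SQL-string form `df.filter("dt_mvmt IS NOT NULL")`) to drop NULL rows, and `df.filter(df.dt_mvmt.isNull())` to keep only them, whereas `df.filter(df.dt_mvmt == None)` silently matches nothing.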
