I'm trying to filter a PySpark DataFrame that has None as a row value:

df.select('dt_mvmt').distinct().collect()
[Row(dt_mvmt=u'2016-03-27'), ...]
There are multiple ways to remove/filter the null values from a column in a DataFrame.
Let's create a simple DataFrame with the code below:
from pyspark.sql.functions import col
from pyspark.sql.types import StringType

date = ['2016-03-27', '2016-03-28', '2016-03-29', None, '2016-03-30', '2016-03-31']
df = spark.createDataFrame(date, StringType())
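Assuming a SparkSession is already available as spark (as it is in the PySpark shell), df.show() should print something along these lines, with the None rendered as null in the single "value" column:

df.show()
+----------+
|     value|
+----------+
|2016-03-27|
|2016-03-28|
|2016-03-29|
|      null|
|2016-03-30|
|2016-03-31|
+----------+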
Now you can try one of the approaches below to filter out the null values.
# Approach - 1: SQL-style expression string
df.filter("value is not null").show()

# Approach - 2: column referenced via the col() function
df.filter(col("value").isNotNull()).show()

# Approach - 3: column referenced via bracket notation
df.filter(df["value"].isNotNull()).show()

# Approach - 4: column referenced via attribute access
df.filter(df.value.isNotNull()).show()

# Approach - 5: drop rows with nulls in the given columns
df.na.drop(subset=["value"]).show()

# Approach - 6: dropna() is an alias for na.drop()
df.dropna(subset=["value"]).show()
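All six approaches are equivalent for this data; as a quick sanity check (using the df created above), each should leave five rows:

df.filter(col("value").isNotNull()).count()
# 5 (the row holding None is gone)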
# Note: You can also use the where function instead of filter; in PySpark, where is an alias for filter.
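For example, the first two filter calls above can be rewritten with where and behave the same way:

df.where("value is not null").show()
df.where(col("value").isNotNull()).show()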
You can also check the section "Working with NULL Values" on my blog for more information.
I hope it helps.