get datatype of column using pyspark

野的像风 2021-01-31 15:58

We are reading data from a MongoDB collection. A column in the collection can hold values of two different types (e.g. (bson.Int64, int) or (int, float)).

I am trying to get the data type of a column using PySpark.
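
Roughly how the collection is loaded (a minimal sketch; the URI, database, and collection names are placeholders, and the "mongo" format name assumes the MongoDB Spark Connector 3.x):

    from pyspark.sql import SparkSession

    # Session configured with a placeholder connection URI
    spark = (
        SparkSession.builder
        .appName("mongo-dtypes")
        .config("spark.mongodb.input.uri", "mongodb://localhost/test.my_collection")
        .getOrCreate()
    )

    # "mongo" is the connector's short format name in 3.x
    df = spark.read.format("mongo").load()
    print(df.dtypes)  # [(column_name, type_string), ...]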

6 Answers
  •  忘掉有多难
    2021-01-31 16:40

    For anyone else who came here looking for an answer to the exact question in the post title (i.e. the data type of a single column, not multiple columns), I have been unable to find a simple way to do so.

    Luckily it's trivial to get the type using dtypes:

    def get_dtype(df, colname):
        # df.dtypes is a list of (column_name, type_string) pairs;
        # return the type string of the first column named colname
        return [dtype for name, dtype in df.dtypes if name == colname][0]

    get_dtype(my_df, 'column_name')


    (note that this will return only the first matching column's type if there are multiple columns with the same name)
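
    If you want the actual DataType object rather than its string name, an alternative is to look the field up in the DataFrame's schema. A minimal sketch (my_df and 'column_name' are the same placeholders as above):

        # StructType supports lookup by field name; .dataType is the typed
        # object (e.g. LongType()), and .simpleString() is its short name
        dtype_obj = my_df.schema['column_name'].dataType
        print(dtype_obj.simpleString())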
