I have a PySpark RDD of floats:

    proba_classe_0.take(2)
    # [0.38030685472943737, 0.34728188900913715]
I want to transform it into a DataFrame:

    from pyspark.sql.types import StructType, StructField, FloatType

    fields = [StructField('probabilite', FloatType())]
    schema = StructType(fields)
    df_proba_classe_1 = spark.createDataFrame(proba_classe_1, schema=schema)
    df_proba_classe_1.count()
Instead I get a strange error:

    TypeError: StructType can not accept object 0.6196931452705625 in type <class 'float'>