Possible to put records that aren't the same length as the header record into a bad_record directory?

巧了我就是萌 submitted on 2019-12-04 20:53:49
// Read the raw file and use the header's field count as the expected width
val a = spark.sparkContext.textFile(pathOfYourFile)
// Use the -1 limit so trailing empty fields are counted too,
// matching the split used in the filter below
val size = a.first.split("\\|", -1).length
// Records whose field count differs from the header are written out as errors
a.filter(i => i.split("\\|", -1).length != size).saveAsTextFile("/mnt/adls/udf_databricks/error")
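A side note on why the `-1` limit matters in the filter above: without it, `split` discards trailing empty fields, so a record ending in empty columns would be miscounted. A minimal sketch (the sample row is hypothetical):

```scala
// "a|b||" really has four fields: "a", "b", "", ""
val row = "a|b||"
val dropped = row.split("\\|").length     // trailing empties removed -> 2
val kept    = row.split("\\|", -1).length // all fields kept -> 4
```

Using the same `-1` limit for both the header count and the per-record count keeps the comparison consistent.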

The option below is supported by the Databricks implementation of Spark. I don't see schema mapping in your code; could you map it and try?

.option("badRecordsPath", "/mnt/adls/udf_databricks/error")

Change your code as shown below:

import org.apache.spark.sql.types._

// Explicit schema for the four pipe-delimited columns
val customSchema = StructType(Array(
    StructField("col_a", StringType, true),
    StructField("col_b", StringType, true),
    StructField("col_c", StringType, true),
    StructField("col_d", StringType, true)))

// Rows that cannot be parsed against the schema go to badRecordsPath
val df = spark.read
   .option("sep", props.inputSeperator)
   .option("header", "true")
   .option("badRecordsPath", "/mnt/adls/udf_databricks/error")
   .schema(customSchema)
   .csv(inputLoc)
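If it helps, here is a hedged sketch of inspecting what Databricks writes to the bad-records path: on Databricks runtimes the rejected rows are stored as JSON files containing the raw record and the failure reason, under timestamped subdirectories (the exact `*/bad_records` layout is an assumption and may vary by runtime version):

```scala
// Hypothetical follow-up on a Databricks cluster: read back the
// rejected records to see which rows failed and why
val badDf = spark.read.json("/mnt/adls/udf_databricks/error/*/bad_records")
badDf.show(false)
```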

For more details, you can refer to the Databricks documentation on badRecordsPath.

Thanks, Karthick
