Getting a NullPointerException when running saveAsNewAPIHadoopDataset in Scala Spark 2 to HBase
I am saving an RDD of Puts to HBase using saveAsNewAPIHadoopDataset. Below is my job creation and submission.

    val outputTableName = "test3"
    val conf2 = HBaseConfiguration.create()
    conf2.set("hbase.zookeeper.quorum", "xx.xx.xx.xx")
    conf2.set("hbase.mapred.outputtable", outputTableName)
    conf2.set("mapreduce.outputformat.class", "org.apache.hadoop.hbase.mapreduce.TableOutputFormat")
    val job = createJob(outputTableName, conf2)
    val outputTable = sc.broadcast(outputTableName)
    val hbasePuts = simpleRdd
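For context, here is a minimal, self-contained sketch of how I understand the saveAsNewAPIHadoopDataset write path. Since createJob and simpleRdd are defined elsewhere in my code, the Job setup, the sample (rowKey, value) RDD, and the column family "cf" below are stand-ins for illustration, not my exact implementation.

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Put
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
    import org.apache.hadoop.hbase.util.Bytes
    import org.apache.hadoop.mapreduce.Job
    import org.apache.spark.{SparkConf, SparkContext}

    object HBaseWriteSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("hbase-write-sketch"))

        val outputTableName = "test3"
        val conf = HBaseConfiguration.create()
        conf.set("hbase.zookeeper.quorum", "xx.xx.xx.xx")
        conf.set(TableOutputFormat.OUTPUT_TABLE, outputTableName)

        // The Job wraps the configuration and declares the output format and key/value types.
        // This stands in for my createJob helper.
        val job = Job.getInstance(conf)
        job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])
        job.setOutputKeyClass(classOf[ImmutableBytesWritable])
        job.setOutputValueClass(classOf[Put])

        // Hypothetical input standing in for simpleRdd: (rowKey, value) pairs.
        val simpleRdd = sc.parallelize(Seq(("row1", "v1"), ("row2", "v2")))

        // Convert each record into (ImmutableBytesWritable, Put), the pair type
        // that TableOutputFormat expects.
        val hbasePuts = simpleRdd.map { case (rowKey, value) =>
          val put = new Put(Bytes.toBytes(rowKey))
          put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
          (new ImmutableBytesWritable(Bytes.toBytes(rowKey)), put)
        }

        // Write the pairs to HBase using the Job's configuration.
        hbasePuts.saveAsNewAPIHadoopDataset(job.getConfiguration)
        sc.stop()
      }
    }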