flambo

Convert from clojure.lang.LazySeq to type org.apache.spark.api.java.JavaRDD

Question: I developed a function in Clojure to fill in an empty column from the last non-empty value. I'm assuming this works, given:

    (:require [flambo.api :as f])

    (defn replicate-val [rdd input]
      (let [{:keys [col]} input
            result (reductions
                     (fn [a b]
                       (if (empty? (nth b col))
                         (assoc b col (nth a col))
                         b))
                     rdd)]
        (println "Result type is: " (type result))))

I got this:

    ;=> "Result type is: clojure.lang.LazySeq"

The question is: how do I convert this back to type JavaRDD, using flambo (the Spark wrapper)?
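One way to get back to a JavaRDD, assuming the reduced sequence fits in driver memory and a SparkContext (here sc) is available, is to collect the RDD into a local Clojure sequence, run reductions on that, and then hand the result back to Spark with flambo's parallelize (f/collect and f/parallelize are thin wrappers over the corresponding JavaRDD/JavaSparkContext methods). This is only a sketch: replicate-val, col, and input come from the question above, while sc and the surrounding setup are assumptions and may need adjusting (master, app name, serializer settings).

    (ns example.replicate
      (:require [flambo.api :as f]
                [flambo.conf :as conf]))

    ;; Assumed local Spark setup -- adjust master/app-name for your cluster.
    (def sc
      (f/spark-context
        (-> (conf/spark-conf)
            (conf/master "local[*]")
            (conf/app-name "replicate-val-example"))))

    (defn replicate-val
      "Fills empty values in column `col` with the last non-empty value,
       then returns a JavaRDD again. Collects to the driver first, so this
       only suits data that fits in driver memory."
      [rdd {:keys [col]}]
      (let [rows   (seq (f/collect rdd))        ;; JavaRDD -> local seq of rows
            filled (reductions
                     (fn [a b]
                       (if (empty? (nth b col))
                         (assoc b col (nth a col))
                         b))
                     rows)]
        (f/parallelize sc (vec filled))))       ;; local seq -> JavaRDD

Note the trade-off: because reductions is a sequential, order-dependent scan, this sketch pulls everything to the driver rather than distributing the work; it answers the type-conversion question, not how to do the fill as a distributed operation.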
