I would like to be able to pass extra variables to functions that are called by spark_apply in sparklyr.
For example:
# setup
library(sparklyr)
sc <- spark_connect(master='local', packages=TRUE)
iris2 <- iris[, 1:(ncol(iris) - 1)]  # drop the non-numeric Species column
df1 <- sdf_copy_to(sc, iris2, repartition=5, overwrite=TRUE)
# This works fine
res <- spark_apply(df1, function(x) kmeans(x, 3)$centers)
# This does not work: k is defined on the driver but is not available to the workers
k <- 3
res <- spark_apply(df1, function(x) kmeans(x, k)$centers)
As an ugly workaround, I can get what I want by storing the value in an R package and then referencing it from there, i.e.:
> myPackage::k_equals_three == 3
[1] TRUE
# This also works
res <- spark_apply(df1, function(x) kmeans(x, myPackage::k_equals_three)$centers)
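Here myPackage is a package I control whose only job is to export the constant, and which has to be installed on every worker so spark_apply can find it. A hypothetical sketch of its single R file:
# Hypothetical contents of myPackage/R/constants.R
#' Constant holding the number of clusters, available wherever the
#' package is installed (including Spark worker nodes)
#' @export
k_equals_three <- 3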
Is there a better way to do this?
I don't have Spark set up to test, but can you just create a closure?
kmeanswithk <- function(k) {
  force(k)  # evaluate k now so its value is baked into the closure
  function(x) kmeans(x, k)$centers
}
k <- 3
res <- spark_apply(df1, kmeanswithk(k))
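The force(k) call matters because R evaluates function arguments lazily: without it, the returned function holds an unevaluated promise that looks k up only when the function is first called, so it can pick up a later value of k (or fail to serialize cleanly). A minimal sketch in plain R, no Spark needed, using hypothetical make_fn helpers:
# Without force(), the inner function holds a promise that reads k
# from the calling environment when it is first evaluated.
make_fn <- function(k) function(x) x + k
k <- 3
f <- make_fn(k)
k <- 100
f(1)  # 101: the promise is only forced now, and sees the current k

# With force(), k's value is fixed when the closure is created.
make_fn2 <- function(k) { force(k); function(x) x + k }
k <- 3
g <- make_fn2(k)
k <- 100
g(1)  # 4: the closure captured k = 3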
Basically, just create a function that returns a function, then use that.
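Depending on your sparklyr version, spark_apply also documents a context argument that serializes an arbitrary object and passes it to f as a second argument, which avoids the closure entirely. A sketch assuming that API (check ?spark_apply in your installed version):
# Hypothetical sketch: pass k through spark_apply's context argument;
# f then receives the partition data frame plus the deserialized context.
k <- 3
res <- spark_apply(
  df1,
  function(x, ctx) kmeans(x, ctx$k)$centers,
  context = list(k = k)
)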
Source: https://stackoverflow.com/questions/46349921/how-to-pass-variables-to-functions-called-in-spark-apply