How to implement the Stanford CoreNLP wrapper for Apache Spark using sparklyr?
Question: I am trying to create an R package so I can use the Stanford CoreNLP wrapper for Apache Spark (by databricks) from R. I am using the sparklyr package to connect to my local Spark instance. I created a package with the following dependency function:

    spark_dependencies <- function(spark_version, scala_version, ...) {
      sparklyr::spark_dependency(
        jars = c(
          system.file(
            sprintf("stanford-corenlp-full/stanford-corenlp-3.6.0.jar"),
            package = "sparkNLP"
          ),
          system.file(
            sprintf("stanford-corenlp-full
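For context, a sparklyr extension typically pairs a spark_dependencies() function like the one above with an .onLoad() hook that registers the extension so the jars are attached when spark_connect() is called. The sketch below shows one way the truncated function might be completed; the "-models" jar name, the "sparkNLP" package name, and the databricks:spark-corenlp Spark-package coordinates are assumptions rather than details taken from the question, so they should be adjusted to whatever actually ships under inst/stanford-corenlp-full/:

    # A minimal sketch, assuming the CoreNLP 3.6.0 jars live under
    # inst/stanford-corenlp-full/ in a package named "sparkNLP".
    # The "-models" jar name and the databricks:spark-corenlp
    # coordinates below are assumptions, not confirmed by the question.
    spark_dependencies <- function(spark_version, scala_version, ...) {
      sparklyr::spark_dependency(
        jars = c(
          system.file(
            "stanford-corenlp-full/stanford-corenlp-3.6.0.jar",
            package = "sparkNLP"
          ),
          system.file(
            "stanford-corenlp-full/stanford-corenlp-3.6.0-models.jar",  # assumed file name
            package = "sparkNLP"
          )
        ),
        # The databricks wrapper itself can be resolved as a Spark package;
        # the exact coordinates/version here are an assumption.
        packages = sprintf("databricks:spark-corenlp:0.2.0-s_%s", scala_version)
      )
    }

    # sparklyr extensions also need to register themselves when the R package loads.
    .onLoad <- function(libname, pkgname) {
      sparklyr::register_extension(pkgname)
    }

With something along these lines in place, loading the package in a fresh session and then calling sparklyr::spark_connect(master = "local") should put the listed jars on the Spark classpath.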