SQL / sparklyr / SparkR DataFrame conversions on Databricks

Submitted by 痞子三分冷 on 2019-12-23 20:55:20

Question


I have a SQL table on Databricks, created with the following code:

%sql 
CREATE TABLE data 
USING CSV 
OPTIONS (header "true", inferSchema "true") 
LOCATION "url/data.csv" 

The following code converts that table to a SparkR DataFrame and an R data.frame, respectively:

%r
library(SparkR)
data_spark <- sql("SELECT * FROM data")
data_r_df <- as.data.frame(data_spark)

But how should I convert any (or all) of these DataFrames into a sparklyr table, so I can leverage sparklyr's parallelism?


Answer 1:


Just connect and reference the existing table:

sc <- spark_connect(...)

data_spark <- dplyr::tbl(sc, "data")

or run the SQL through the Spark session and register the result as a sparklyr table:

sc %>% spark_session() %>% invoke("sql", "SELECT * FROM data") %>% sdf_register()
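Putting the answer's two approaches together, a minimal sketch might look like the following. Assumptions are flagged in comments: the `method = "databricks"` connection option and the registered name `"data_tbl"` are illustrative, not from the original post.

```r
library(sparklyr)
library(dplyr)

# Assumption: running inside a Databricks notebook, where
# method = "databricks" attaches to the cluster's existing Spark session.
sc <- spark_connect(method = "databricks")

# Approach 1: lazily reference the existing "data" table via dplyr.
data_tbl <- tbl(sc, "data")

# Approach 2: run SQL through the Spark session and register the result;
# the name "data_tbl2" is a hypothetical choice.
data_tbl2 <- sc %>%
  spark_session() %>%
  invoke("sql", "SELECT * FROM data") %>%
  sdf_register("data_tbl2")

# Either object now supports dplyr verbs executed on the cluster,
# e.g. a row count collected back into R:
data_tbl %>% count() %>% collect()
```

Both objects are lazy references: no data moves to the R driver until you call `collect()`, which is what lets sparklyr's operations run in parallel on the cluster.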


Source: https://stackoverflow.com/questions/51504713/sql-sparklyr-sparkr-dataframe-conversions-on-databricks
