How to read all files in S3 folder/bucket using sparklyr in R?

Submitted by 只谈情不闲聊 on 2021-02-07 17:24:30

Question


I have tried the code below, and several variations of it, to read all of the files in an S3 folder, but nothing seems to work. Sensitive information/code has been removed from the script below. There are 6 files, each about 6.5 GB.

library(sparklyr)

# Spark connection
sc <- spark_connect(master = "local", config = config)


rd_1 <- spark_read_csv(sc, name = "Retail_1", path = "s3a://mybucket/xyzabc/Retail_Industry/*/*", header = FALSE, delimiter = "|")


# This is the S3 bucket/folder for the files (one of the file names is Industry_Raw_Data_000):
s3://mybucket/xyzabc/Retail_Industry/Industry_Raw_Data_000

This is the error I get:

Error: org.apache.spark.sql.AnalysisException: Path does not exist: s3a://mybucket/xyzabc/Retail_Industry/*/*;
at org.apache.spark.sql.execution.datasources.DataSource$.org$apache$spark$sql$execution$datasources$DataSource$$checkAndGlobPathIfNecessary(DataSource.scala:710)

Answer 1:


After a few weeks of googling the issue, I solved it: the Spark connection needs the AWS SDK and hadoop-aws packages (which provide the s3a filesystem) plus AWS credentials before it can resolve s3a:// paths. Here is the solution.

# AWS credentials (placeholder values)
Sys.setenv(AWS_ACCESS_KEY_ID = "abc")
Sys.setenv(AWS_SECRET_ACCESS_KEY = "xyz")

# Packages Spark needs to read CSVs from S3 over the s3a filesystem
config <- spark_config()

config$sparklyr.defaultPackages <- c(
  "com.databricks:spark-csv_2.10:1.5.0",
  "com.amazonaws:aws-java-sdk-pom:1.10.34",
  "org.apache.hadoop:hadoop-aws:2.7.3")



# Spark connection
sc <- spark_connect(master = "local", config = config)

# Hadoop configuration: get a JavaSparkContext so we can reach the Hadoop conf
ctx <- spark_context(sc)
jsc <- invoke_static(
  sc,
  "org.apache.spark.api.java.JavaSparkContext",
  "fromSparkContext",
  ctx
)

# S3A settings on the Hadoop configuration
hconf <- jsc %>% invoke("hadoopConfiguration")
hconf %>% invoke("set", "com.amazonaws.services.s3a.enableV4", "true")
hconf %>% invoke("set", "fs.s3a.fast.upload", "true")

# Reading the folder itself picks up every file inside it
folder_files <- "s3a://mybucket/abc/xyz"

rd_11 <- spark_read_csv(sc, name = "Retail", path = folder_files, infer_schema = TRUE, header = FALSE, delimiter = "|")
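Note that pointing spark_read_csv at the folder reads all six files in one go; the two-level glob */* in the question also looked one directory deeper than the files actually sit, so it could not have matched them. As a quick sanity check before disconnecting (a minimal sketch using the rd_11 handle from above, not part of the original answer):

library(dplyr)

# Sanity check (sketch): count rows across all files read from the
# folder and preview the first few records.
sdf_nrow(rd_11)
rd_11 %>% head(5) %>% collect()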


spark_disconnect(sc)


Source: https://stackoverflow.com/questions/53588696/how-to-read-all-files-in-s3-folder-bucket-using-sparklyr-in-r
