httr

How to POST multipart/related content with httr (for Google Drive API)

Submitted by 旧巷老猫 on 2019-12-01 17:42:31
I got simple file uploads to Google Drive working using httr. The problem is that every document is uploaded as "untitled", and I have to PATCH the metadata to set the title. The PATCH request occasionally fails. According to the API, I ought to be able to do a multipart upload, allowing me to specify the title as part of the same POST request that uploads the file.

    res <- POST(
      "https://www.googleapis.com/upload/drive/v2/files?convert=true",
      config(token = google_token),
      body = list(y = upload_file(file))
    )
    id <- fromJSON(rawToChar(res$content))$id
    if (is.null(id)) stop("Upload failed")
    url <- paste(
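
Below is a minimal sketch of one way to build the multipart/related request by hand with httr. It assumes a text (CSV) file, the existing `google_token` and `file` objects from the snippet above, and placeholder values for the title and boundary string; treat it as untested.

    library(httr)
    library(jsonlite)

    # Hand-built multipart/related body: first part is JSON metadata, second is the file.
    # Assumes a text file; a binary upload would need readBin() and a different content type.
    boundary <- "foo_bar_baz"
    metadata <- toJSON(list(title = "My report"), auto_unbox = TRUE)
    body <- paste0(
      "--", boundary, "\r\n",
      "Content-Type: application/json; charset=UTF-8\r\n\r\n",
      metadata, "\r\n",
      "--", boundary, "\r\n",
      "Content-Type: text/csv\r\n\r\n",
      paste(readLines(file), collapse = "\n"), "\r\n",
      "--", boundary, "--"
    )
    res <- POST(
      "https://www.googleapis.com/upload/drive/v2/files?uploadType=multipart&convert=true",
      config(token = google_token),
      add_headers("Content-Type" = paste0('multipart/related; boundary="', boundary, '"')),
      body = body
    )
    id <- content(res)$id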

R: use rvest (or httr) to log in to a site requiring cookies

Submitted by 橙三吉。 on 2019-12-01 11:01:15
I'm trying to automate the shibboleth-based login process for the UK Data Service in R. One can sign up for an account to log in here. A previous attempt to automate this process is found in this question, automating the login to the uk data service website in R with RCurl or httr. I thought the excellent answers to this question, how to authenticate a shibboleth multi-hostname website with httr in R, were going to get me there, but I've run into a wall. And, yes, RSelenium provides an alternative (which I've actually tried), but my experience with RSelenium is that it is always flaking out
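
For reference, the usual rvest session/form pattern (which keeps cookies across requests) looks roughly like the sketch below. The entry URL, form index, and field names are assumptions that would have to be checked against the real shibboleth pages, which typically also return an intermediate SAML form that must be submitted the same way.

    library(rvest)

    # Sketch only: URL and field names are placeholders, not the real UKDS form.
    login_url <- "https://ukdataservice.ac.uk/login"   # hypothetical entry point
    session   <- html_session(login_url)               # the session object carries cookies

    form    <- html_form(session)[[1]]                 # assumes the login form is the first form
    form    <- set_values(form, username = "me@example.ac.uk", password = "secret")
    session <- submit_form(session, form)

    # Shibboleth usually responds with a hidden "SAMLResponse" form that must
    # also be submitted before the session is fully authenticated.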

How to specify certificate, key and root certificate with httr for certificate based authentication?

Submitted by 蓝咒 on 2019-12-01 09:17:16
I am trying to access data, using the httr library, from a server which expects certificate-based authentication. I have a certificate (cert.pem), a key file (key.pem), and a root certificate (caroot.pem). The following curl command works:

    curl -H "userName:sriharsha@rpc.com" --cert cert.pem --key certkey.key --cacert caroot.pem https://api.somedomain.com/api/v1/timeseries/klog?limit=1

How can I specify certkey.key and caroot.pem in an httr GET request? I am trying the following R command but couldn't find options to specify the cert key and caroot.

    cafile = ????
    r <- GET("https://api.somedomain.com/api/v1/timeseries/klog", query =
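
One likely mapping, sketched below and untested: httr::config() passes libcurl option names straight through, and sslcert, sslkey, and cainfo are the libcurl counterparts of curl's --cert, --key, and --cacert flags. The file paths are assumed to be in the working directory.

    library(httr)

    r <- GET(
      "https://api.somedomain.com/api/v1/timeseries/klog",
      query = list(limit = 1),
      add_headers(userName = "sriharsha@rpc.com"),
      config(sslcert = "cert.pem",     # --cert
             sslkey  = "certkey.key",  # --key
             cainfo  = "caroot.pem")   # --cacert
    )
    content(r)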

R rvest: could not find function “xpath_element”

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-01 04:40:59
I am trying to simply replicate the example of rvest::html_nodes(), yet encounter an error:

    library(rvest)
    ateam <- read_html("http://www.boxofficemojo.com/movies/?id=ateam.htm")
    html_nodes(ateam, "center")
    Error in do.call(method, list(parsed_selector)) : could not find function "xpath_element"

The same happens if I load packages such as httr, xml2, or selectr. I seem to have the latest versions of these packages too... In which packages are functions such as xpath_element and xpath_combinedselector located? How do I get it to work? Note that I am running on Ubuntu 16.04, so that code might
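
One common explanation, which would need verifying on the affected machine, is a version mismatch: xpath_element reportedly lives in the selectr package that rvest calls when translating CSS selectors, so an old selectr alongside a newer rvest can produce exactly this error. A quick check-and-update sketch:

    # Check the installed versions, then update the packages rvest delegates
    # selector translation to (selectr, xml2); restart R afterwards.
    packageVersion("rvest")
    packageVersion("selectr")
    packageVersion("xml2")
    install.packages(c("selectr", "xml2", "rvest"))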

R Disparity between browser and GET / getURL

Submitted by 心已入冬 on 2019-11-30 22:35:44
I'm trying to download the content from a page and I'm finding that the response data is either malformed or incomplete, as if GET or getURL are pulling before those data are loaded.

    library(httr)
    library(RCurl)
    url <- "https://www.vanguardcanada.ca/individual/etfs/etfs.htm"
    d1 <- GET(url)     # This shows a lot of {{ moustache style }} code that's not filled
    d2 <- getURL(url)  # This shows "" as if it didn't get anything

I'm not sure how to proceed. My goal is to get the numbers associated with the links that show in the browser: https://www.vanguardcanada.ca/individual/etfs/etfs-detail-overview
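
A note on the likely cause: the {{ moustache }} placeholders mean the table is filled in client-side by JavaScript, so neither GET() nor getURL() will ever see the numbers in the raw HTML. The usual workaround is to fetch the JSON endpoint the page calls via XHR; the URL below is purely a placeholder to be replaced with the real endpoint found in the browser's developer tools (Network tab).

    library(httr)
    library(jsonlite)

    # Placeholder URL: substitute the XHR endpoint visible in the Network tab.
    json_url <- "https://www.vanguardcanada.ca/path/to/json/endpoint"
    res  <- GET(json_url)
    etfs <- fromJSON(content(res, as = "text", encoding = "UTF-8"))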

SSL verification causes RCurl and httr to break - on a website that should be legit

Submitted by ╄→尐↘猪︶ㄣ on 2019-11-30 13:08:52
I'm trying to automate the login for the UK's data archive service. That website is obviously trustworthy. Unfortunately, both RCurl and httr break at SSL verification. My web browser doesn't give any sort of warning. I can work around the issue by using ssl.verifypeer = FALSE in RCurl, but I'd like to understand what's going on.

    # breaks
    library(httr)
    GET("https://www.esds.ac.uk/secure/UKDSRegister_start.asp")

    # breaks
    library(RCurl)
    cert <- system.file("CurlSSL/cacert.pem", package = "RCurl")
    getURL("https://www.esds.ac.uk/secure/UKDSRegister_start.asp", cainfo = cert)

    # works
    library(RCurl)
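
Two hedged workarounds, assuming the server omits an intermediate certificate that browsers silently fetch for themselves (a common cause of this symptom): either skip peer verification (quick but insecure) or point libcurl at a CA bundle that can complete the chain. The bundle path below is a system-specific guess (Debian/Ubuntu).

    library(httr)
    library(RCurl)

    # Insecure but quick: skip peer verification in httr.
    GET("https://www.esds.ac.uk/secure/UKDSRegister_start.asp",
        config(ssl_verifypeer = FALSE))

    # Keep verification on, but use the system CA bundle instead of RCurl's copy.
    getURL("https://www.esds.ac.uk/secure/UKDSRegister_start.asp",
           cainfo = "/etc/ssl/certs/ca-certificates.crt")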

Getting data in R as dataframe from web source

Submitted by 我是研究僧i on 2019-11-30 09:02:41
I am trying to load some air pollution background data directly into R as a data.frame using the RCurl package. The website in question has three dropdown boxes for choosing options before downloading the .csv file, as shown in the screenshot accompanying the original question. I am trying to select three values from the dropdown boxes and download the data, using the "Download CSV" button, directly into R as a data.frame. I want to download the different combinations of multiple years and multiple pollutants for a specific site. In other posts on
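
In general the dropdown choices end up as form fields on the request behind the "Download CSV" button, so the shape of a solution looks like the sketch below. The endpoint and field names are invented placeholders; the real ones have to be read off the actual download request in the browser's developer tools (Network tab).

    library(httr)

    # Hypothetical endpoint and field names; inspect the real download request first.
    res <- POST(
      "https://example-air-quality-site.org/download-csv",
      body = list(site = "ABC1", pollutant = "NO2", year = "2014"),
      encode = "form"
    )
    df <- read.csv(text = content(res, as = "text", encoding = "UTF-8"))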

R: Download image using rvest

Submitted by 瘦欲@ on 2019-11-30 07:31:47
I'm attempting to download a png image from a secure site through R. To access the secure site I used rvest, which worked well. So far I've extracted the URL of the png image. How can I download the image at this link using rvest? Functions outside of rvest return errors due to not having permission. Current attempts:

    library(rvest)
    library(httr)  # for user_agent()
    uastring <- "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36"
    session <- html_session("https://url.png", user_agent(uastring))
    form <- html_form(session)[[1]]
    form <- set_values(form, username = "***"
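
Assuming the login above succeeds, one way to reuse the authenticated session for the image itself is jump_to(), which sends the session's cookies, followed by writing the raw response bytes to disk. Here png_url stands in for the image link already extracted.

    library(rvest)
    library(httr)

    # Fetch the image through the logged-in session, then save the raw bytes.
    png_url <- "https://url.png"            # the extracted image link
    img     <- jump_to(session, png_url)
    writeBin(content(img$response, as = "raw"), "image.png")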