What I need to do is read data from hundreds of links, and some of those links contain no data. The code looks like this:
urls <- paste0(
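(The paste0() call above is cut off in the post. Purely for illustration, here is a minimal sketch of the kind of setup being described; the base URL, the station IDs, and read.csv as the reader are all placeholders, not the original code.)

## Hypothetical sketch only -- real base URL and station IDs are not shown above
station_ids <- sprintf("%03d", 1:300)                      # placeholder station IDs
urls <- paste0("http://example.com/stations/", station_ids, ".csv")

## Read every link into a list; links with no data may still come back
## as a one-column data frame, which the answer below takes advantage of
myData <- lapply(urls, function(u) {
  tryCatch(read.csv(u), error = function(e) data.frame(msg = conditionMessage(e)))
})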
Does this help?
## Number of columns in each list element (assumes every element is a data frame)
dims <- sapply(myData, dim)[2, ]
bad_Ones  <- myData[dims == 1]   # pages that returned only a single placeholder column
good_Ones <- myData[dims > 1]    # pages that returned real data
If myData still grabs something off each station page, the above code should separate the myData list into two groups. good_Ones is the list you would want to work with (assuming the above is accurate, of course).
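As a quick check of the idea, here is a toy myData list (made-up values, not the actual station pages) run through the same split:

## Toy example: one "empty" one-column result and one real multi-column result
myData <- list(
  station_a = data.frame(msg = "no data available"),
  station_b = data.frame(date = c("2015-01-01", "2015-01-02"), value = c(1.2, 3.4))
)

dims <- sapply(myData, dim)[2, ]
dims
## station_a station_b
##         1         2

good_Ones <- myData[dims > 1]    # keeps only station_b
bad_Ones  <- myData[dims == 1]   # station_a, kept around for inspection if needed

Note that sapply(myData, dim) only simplifies to a matrix when every element actually has dimensions; if some downloads return NULL instead of a data frame, you would need to handle those elements first.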