Question
I want to import a very large GeoJSON file into a simple features object using st_read in R, but the hardware demands of converting from GeoJSON to sf seem to be large. For example, importing the Microsoft building footprints data for Ohio (https://github.com/Microsoft/USBuildingFootprints), a 1.2 GB GeoJSON, eats up over 32 GB of RAM during conversion. Is there a method for iterating through rows of a GeoJSON in a function, so I can import parts of the whole file without eating up all that RAM, similar to skipping rows in read.csv?
Answer 1:
Using library(geojsonsf) seems to work without issue on my Mac with 16 GB of RAM:
library(geojsonsf)
library(sf)
sf <- geojsonsf::geojson_sf("~/Downloads/Ohio.geojson")
sf
# Simple feature collection with 5449419 features and 0 fields
# geometry type: POLYGON
# dimension: XY
# bbox: xmin: -84.82027 ymin: 38.40334 xmax: -80.51887 ymax: 41.97041
# epsg (SRID): 4326
# proj4string: +proj=longlat +datum=WGS84 +no_defs
# First 10 features:
# geometry
# 1 POLYGON ((-84.81222 39.9087...
# 2 POLYGON ((-84.80084 39.8882...
# 3 POLYGON ((-84.78565 39.8811...
# 4 POLYGON ((-84.7373 39.9014,...
# 5 POLYGON ((-84.73916 39.8980...
# 6 POLYGON ((-84.80422 39.8646...
# 7 POLYGON ((-84.80025 39.8592...
# 8 POLYGON ((-84.79336 39.8593...
# 9 POLYGON ((-84.79268 39.8604...
# 10 POLYGON ((-84.80194 39.8639...
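If memory is still a concern and you really do want to read the file in chunks, one option is the query argument of sf::st_read, which passes an OGR SQL statement to GDAL; recent GDAL versions support LIMIT and OFFSET in the OGR SQL dialect. A minimal sketch, assuming the GeoJSON layer is named after the file's basename ("Ohio" here) — check with st_layers() first, and note that OFFSET still requires GDAL to scan past the skipped features, so each chunk read is not free:

```r
library(sf)

path <- "~/Downloads/Ohio.geojson"
st_layers(path)  # confirm the layer name before querying

chunk_size <- 100000

# Read one chunk of features; increase the offset to step through the file.
read_chunk <- function(offset) {
  st_read(
    path,
    query = sprintf("SELECT * FROM Ohio LIMIT %d OFFSET %d",
                    chunk_size, offset),
    quiet = TRUE
  )
}

first_chunk  <- read_chunk(0)
second_chunk <- read_chunk(chunk_size)
```

Each call returns an sf object with at most chunk_size features, so you can process and discard chunks one at a time instead of holding all 5.4 million polygons in memory at once.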
Source: https://stackoverflow.com/questions/52899355/using-st-read-to-import-large-geojson-in-iterations-r