python-xarray

Load selection GFS-ensemble openDAP data into memory (Python)

Submitted by 让人想犯罪 __ on 2020-12-06 12:24:14
Question: I want to download a subselection of GFS-ensemble data from an OpenDAP server via netCDF and xarray. However, when trying to load the subselection into memory, the program crashes after a while with a RuntimeError (netCDF: I/O failure). The number of data points I wish to obtain is 13650, so the data size should be easy to handle in Python. Oddly enough, I do not experience this problem when downloading GFS data or NCEP-Reanalysis data. This makes me believe that the issue
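The subselect-then-load pattern described above can be sketched with xarray. This is a minimal, self-contained sketch using a synthetic in-memory dataset; against the real server one would instead open the OpenDAP URL lazily with `xr.open_dataset(url)`. The variable name `tmp2m` and the coordinate layout are illustrative assumptions, not the actual GFS-ensemble schema.

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for the remote GFS-ensemble dataset; for the real case,
# open the OpenDAP URL lazily instead, e.g. ds = xr.open_dataset(opendap_url).
# The variable name "tmp2m" and the coordinates are illustrative assumptions.
ds = xr.Dataset(
    {"tmp2m": (("time", "ens", "lat", "lon"),
               np.random.rand(4, 3, 10, 10).astype("float32"))},
    coords={
        "time": np.arange(4),
        "ens": np.arange(3),
        "lat": np.linspace(0.0, 90.0, 10),
        "lon": np.linspace(0.0, 90.0, 10),
    },
)

# Subselect first (still lazy for a remote dataset), then pull into memory.
subset = (
    ds["tmp2m"]
    .isel(ens=0)                                # single ensemble member
    .sel(lat=slice(30, 60), lon=slice(30, 60))  # label-based spatial window
    .load()                                     # triggers the actual read
)
print(subset.shape)  # (4, 4, 4)
```

One common workaround when a remote read fails mid-transfer (as in the "NetCDF: I/O failure" above) is to load the selection in smaller pieces, for example looping over time steps and calling `.load()` on each slice, rather than issuing one large request.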


Writing xarray multiindex data in chunks

Submitted by 隐身守侯 on 2020-12-02 06:50:40
Question: I am trying to efficiently restructure a large multidimensional dataset. Let's assume I have a number of remotely sensed images over time, with a number of bands, with coordinates x and y for pixel location, time for the time of image acquisition, and band for the different data collected. In my use case, let's assume the xarray coordinate lengths are roughly x (3000), y (3000), time (10), with bands (40) of floating-point data, so 100 GB+ of data. I have been trying to work from this example but I am having trouble
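The restructuring step described above, collapsing dimensions into a MultiIndex, can be sketched on a tiny cube. This is a minimal sketch under assumptions: the array name `reflectance` and the band labels are hypothetical, and the toy array stands in for the real 3000 x 3000 x 10 x 40 data, which one would open lazily and chunk with dask (e.g. `da.chunk({"x": 1000, "y": 1000})`) so it never sits in memory at once.

```python
import numpy as np
import xarray as xr

# Small stand-in for the (x, y, time, band) cube described above; the name
# "reflectance" and the band labels are illustrative assumptions.
da = xr.DataArray(
    np.random.rand(4, 4, 3, 2).astype("float32"),
    dims=("x", "y", "time", "band"),
    coords={"x": np.arange(4), "y": np.arange(4),
            "time": np.arange(3), "band": ["red", "nir"]},
    name="reflectance",
)

# Collapse time and band into a single MultiIndex dimension.
stacked = da.stack(measurement=("time", "band"))

# netCDF cannot store a MultiIndex directly, so flatten it before writing;
# time and band survive as ordinary coordinates on the new dimension.
flat = stacked.reset_index("measurement")
# flat.to_netcdf("restructured.nc")  # requires a netCDF backend; with dask
#                                    # chunks this writes chunk by chunk
print(flat.dims)  # ('x', 'y', 'measurement')
```

Stacking with dask-backed arrays keeps the operation lazy, so the write at the end streams through the data one chunk at a time instead of materializing the full 100 GB+ cube.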
