h5py

Colormap issue using animation in matplotlib

Submitted by 一个人想着一个人 on 2019-12-25 04:08:02
Question: I use `matplotlib.animation` to animate data in a 3D array named `arr`. I read the data from an h5 file using the h5py library and everything is OK. But when using animation, the colormap gets stuck on the first frame's data range, and after some steps it shows unnormalized colors while plotting. Here is my code:

```python
import numpy as np
import h5py
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import matplotlib.cm as cm

f = h5py.File('ez.h5', 'r')
arr = f["ez"][:,:,:]
f.close()
fig =
```
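A common fix for this symptom (a sketch, assuming the problem is per-frame autoscaling rather than anything in the truncated code) is to pin the color normalization to the full array's range once, before animating. The file name `ez.h5` and array name `arr` follow the question; the random data below is a stand-in:

```python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

# Stand-in for the 3D array read from ez.h5 (frames, rows, cols).
arr = np.random.rand(20, 64, 64)

fig, ax = plt.subplots()
# Fix the color scale across all frames so the normalization
# is not taken from the first frame alone.
vmin, vmax = arr.min(), arr.max()
im = ax.imshow(arr[0], vmin=vmin, vmax=vmax, cmap='viridis')

def update(i):
    # set_data reuses the same image and its fixed normalization.
    im.set_data(arr[i])
    return [im]

ani = animation.FuncAnimation(fig, update, frames=arr.shape[0], blit=True)
```

Passing explicit `vmin`/`vmax` to `imshow` (or calling `im.set_clim`) keeps every frame on the same scale.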

h5py : how to rename dimensions?

Submitted by 拟墨画扇 on 2019-12-25 03:48:29
Question: I created a new file whose handle is `fw`.

```python
fw.create_dataset('grp1/varname', data=arr)
```

The groups are created before this command. `arr` is a NumPy array with dimensions (2, 3). The file is created successfully. However, the dimensions are named `phony_0` and `phony_1`. How do I change them to, say, `m` and `n`? In general, how does one create dimensions within a group and then associate variables with them? I tried

```python
fw['grp1/varname'].dims[0].label = 'm'
```

but this does not have the desired effect. `ncdump -h`
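`dims[0].label` only sets an HDF5 label attribute; what netCDF tools like `ncdump` report as dimension names are attached *dimension scales*. A minimal sketch (the file name and coordinate datasets `m`/`n` are illustrative choices, not from the question):

```python
import numpy as np
import h5py

arr = np.arange(6).reshape(2, 3)

with h5py.File('dims_demo.h5', 'w') as fw:  # hypothetical file name
    dset = fw.create_dataset('grp1/varname', data=arr)

    # Create one dataset per dimension, mark each as a dimension
    # scale, then attach it to the variable.
    fw['grp1/m'] = np.arange(2)
    fw['grp1/n'] = np.arange(3)
    fw['grp1/m'].make_scale('m')
    fw['grp1/n'].make_scale('n')
    dset.dims[0].attach_scale(fw['grp1/m'])
    dset.dims[1].attach_scale(fw['grp1/n'])
```

`make_scale` requires h5py 2.10 or newer; on older versions the equivalent calls live in `h5py.h5ds`.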

h5py: Compound datatypes and scale-offset in the compression pipeline

Submitted by 混江龙づ霸主 on 2019-12-24 09:29:44
Question: Using NumPy and h5py, it is possible to create 'compound datatype' datasets to be stored in an HDF5 file:

```python
import h5py
import numpy as np

# Create a new file using default properties.
file = h5py.File('compound.h5', 'w')

# Create a dataset under the root group.
comp_type = np.dtype([('fieldA', 'i4'), ('fieldB', 'f4')])
dataset = file.create_dataset("comp", (4,), comp_type)
```

It is also possible to use various compression filters in a 'compression pipeline', among them the 'scale-offset'
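As far as I know, the HDF5 scale-offset filter operates only on plain integer or floating-point datasets, not compound types. A hedged sketch of two workarounds: gzip the compound dataset as a whole, or store each field as its own dataset with its own scale-offset setting (file and dataset names are illustrative):

```python
import h5py
import numpy as np

comp_type = np.dtype([('fieldA', 'i4'), ('fieldB', 'f4')])
data = np.zeros(4, dtype=comp_type)

with h5py.File('compound_demo.h5', 'w') as f:  # hypothetical file name
    # gzip works on the compound dataset as-is.
    f.create_dataset('comp_gzip', data=data, compression='gzip')

    # scale-offset per field: for integers, 0 means "choose the bit
    # count automatically"; for floats, the value is decimal digits kept.
    f.create_dataset('fieldA', data=data['fieldA'], scaleoffset=0)
    f.create_dataset('fieldB', data=data['fieldB'], scaleoffset=3)
```

Splitting fields trades the compound layout for access to the full filter pipeline on each column.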

h5py - Write object dynamically to file?

Submitted by 不羁岁月 on 2019-12-24 00:55:00
Question: I am trying to write regular Python objects (with several key/value pairs) to an HDF5 file. I am using h5py 2.7.0 with Python 3.5.2.3. Right now, I am trying to write one object in its entirety to a dataset:

```python
# ...read dataset, store one data object in 'obj'
# obj could be something like:
# {'value1': 0.09, 'state': {'angle_rad': 0.034903, 'value2': 0.83322}, 'value3': 0.3}
dataset = h5File.create_dataset('grp2/ds3', data=obj)
```

This produces an error as the underlying dtype can not be converted to
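h5py cannot map a nested Python dict onto a single dataset; one sketch (the helper name `write_dict` and the file name are mine, not from the question) is to mirror dicts as groups and scalar values as datasets:

```python
import h5py

obj = {'value1': 0.09,
       'state': {'angle_rad': 0.034903, 'value2': 0.83322},
       'value3': 0.3}

def write_dict(h5group, d):
    # Recursively map nested dicts to groups and scalars to datasets.
    for key, value in d.items():
        if isinstance(value, dict):
            write_dict(h5group.create_group(key), value)
        else:
            h5group[key] = value

with h5py.File('obj_demo.h5', 'w') as f:  # hypothetical file name
    write_dict(f.create_group('grp2/ds3'), obj)
```

Note that `grp2/ds3` becomes a group here rather than a dataset, which changes how readers must traverse the file.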

Adding data to existing h5py file along new axis using h5py

Submitted by 爷,独闯天下 on 2019-12-24 00:34:40
Question: I have some sample code that generates a 3D NumPy array; I am then saving this data into an h5py file. How can I then "append" the second dataset along the 4th dimension? Or, how can I write another 3D dataset along the 4th dimension (or a new axis) of an existing .h5 file? I have read the documentation that I could find, and none of the examples seem to address this. My code is shown below:

```python
import h5py
import numpy as np

dataset1 = np.random.rand(240, 240, 250)
dataset2 = np.random
```
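One way (a sketch; the shapes are shrunk from the question's (240, 240, 250) purely so it runs quickly) is to create the 4D dataset with an unlimited last axis via `maxshape`, then `resize` before each write:

```python
import h5py
import numpy as np

# Smaller than the question's (240, 240, 250), purely for speed.
dataset1 = np.random.rand(24, 24, 25)
dataset2 = np.random.rand(24, 24, 25)

with h5py.File('stack_demo.h5', 'w') as f:  # hypothetical file name
    # maxshape=None on the last axis makes it extendable; giving a
    # maxshape also switches the dataset to chunked storage, which
    # resize() requires.
    dset = f.create_dataset('data', shape=dataset1.shape + (1,),
                            maxshape=dataset1.shape + (None,))
    dset[..., 0] = dataset1
    # Grow the 4th axis by one and write the second 3D array there.
    dset.resize(dset.shape[3] + 1, axis=3)
    dset[..., 1] = dataset2
```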

Attempt to open h5py file returns errno = 17, error message = 'file exists'

Submitted by 六眼飞鱼酱① on 2019-12-23 17:40:22
Question:

```python
import numpy as np
import h5py

with h5py.File("testfile.hdf5", "w-") as f:
    arr = np.ones((5, 2))
    f["my dataset"] = arr
    dset = f["my dataset"]
```

This code runs correctly the first time, but when run a second time it returns the following error:

```
%run "C:\Users\James\Google Drive\Python Scripts\Python and HDF5\Chapter3.py"
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
C:\Users\James\Google Drive\Python Scripts\Python and
```
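The `"w-"` mode is the cause, not a bug: it means "create, fail if the file already exists", and HDF5 surfaces the failure as errno 17. A sketch of the alternatives:

```python
import os
import numpy as np
import h5py

path = 'testfile.hdf5'

# "w-" raises on an existing file; "w" truncates unconditionally,
# and "a" opens an existing file read/write (creating it if absent).
with h5py.File(path, 'w') as f:
    f['my dataset'] = np.ones((5, 2))

# Running again with "w" succeeds where "w-" would raise.
with h5py.File(path, 'w') as f:
    f['my dataset'] = np.ones((5, 2))
```

Keep `"w-"` when accidental truncation of results would be worse than the error; otherwise `"w"` or `"a"` fits a re-runnable script better.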

h5py returning unexpected results in indexing

Submitted by 旧时模样 on 2019-12-23 05:12:33
Question: I'm attempting to fill an h5py dataset with a series of NumPy arrays that I generate in sequence so my memory can handle it. The h5py array is initialised so that the first dimension can have any magnitude:

```python
f.create_dataset('x-data', (1, maxlen, 50), maxshape=(None, maxlen, 50))
```

After generating each NumPy array `X`, I am using

```python
f['x-data'][alen:alen + len(data), :, :] = X
```

where, for example, in the first array, `alen=0` and `len(data)=10056`. I then increment `alen` so the next array will start from
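A likely culprit (a hedged guess from the snippet) is that the first axis was created with extent 1 and never grown: `maxshape=(None, ...)` only permits growth, while writes are bounded by the current shape. A sketch that resizes before each write; `maxlen`, the batch size, and the file name are made-up values:

```python
import h5py
import numpy as np

maxlen = 20  # hypothetical stand-in for the question's sequence length

with h5py.File('xdata_demo.h5', 'w') as f:  # hypothetical file name
    # Start empty on the first axis; maxshape=None allows growth.
    dset = f.create_dataset('x-data', shape=(0, maxlen, 50),
                            maxshape=(None, maxlen, 50))
    alen = 0
    for _ in range(3):
        X = np.random.rand(10, maxlen, 50)  # one generated batch
        # Grow the first axis to fit, *then* assign the slice.
        dset.resize(alen + len(X), axis=0)
        dset[alen:alen + len(X), :, :] = X
        alen += len(X)
```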

How to get special derivative from an interpolated function

Submitted by 烈酒焚心 on 2019-12-23 04:50:13
Question: I have created an h5 file for a simple cube, then read it with Python, and finally used the `RegularGridInterpolator` function to interpolate. Everything works perfectly for me. But I want to know how I can change my code so that I can get the derivative of this interpolated function. For your kind information, I have given my code here.

Code for creating the h5 file:

```python
import numpy as np
import h5py

def f(x, y, z):
    return 2 * x**3 + 3 * y**2 - z

x = np.linspace(-1, 1, 2)
y = np.linspace(-1, 1, 2)
z = np
```
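`RegularGridInterpolator` does not expose an analytic derivative; a common workaround is a central finite difference of the interpolated function. A sketch using the question's f(x, y, z) on a finer grid (with the question's 2-point grid, the linear interpolant's derivative would be constant over the whole cube):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def f(x, y, z):
    return 2 * x**3 + 3 * y**2 - z

# Finer grid than the question's linspace(-1, 1, 2), for a usable slope.
x = y = z = np.linspace(-1, 1, 21)
values = f(*np.meshgrid(x, y, z, indexing='ij'))
interp = RegularGridInterpolator((x, y, z), values)

def partial(interp, pt, axis, h=1e-5):
    # Central finite difference of the interpolated function along one axis.
    lo, hi = np.array(pt, float), np.array(pt, float)
    lo[axis] -= h
    hi[axis] += h
    return ((interp(hi) - interp(lo)) / (2 * h)).item()

dy = partial(interp, (0.5, 0.5, 0.5), axis=1)  # ~ d/dy of 3*y**2 = 6y = 3.0
```

The step `h` trades truncation error against interpolation noise; with linear interpolation the estimate is only as smooth as the grid allows.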

Merge all h5 files using h5py

Submitted by 天大地大妈咪最大 on 2019-12-23 02:46:20
Question: I am a novice at coding. Can someone help with a script in Python using h5py wherein we can read all the directories and sub-directories to merge multiple h5 files into a single h5 file?

Answer 1: What you need is a list of all datasets in the file. I think the notion of a recursive function is what is needed here. This would allow you to extract all 'datasets' from a group, but when one of them appears to be a group itself, recursively do the same thing until all datasets are found. For
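The answer's recursive idea maps directly onto `visititems`, which walks a file's groups for you. A sketch (the demo file names and the choice to prefix each dataset's path with its source file name are my own, not from the answer):

```python
import h5py

def merge_h5(source_paths, dest_path):
    # Copy every dataset from each source file into dest_path,
    # prefixing its path with the source file name to avoid clashes.
    with h5py.File(dest_path, 'w') as dest:
        for src_path in source_paths:
            with h5py.File(src_path, 'r') as src:
                def copy_dataset(name, obj):
                    if isinstance(obj, h5py.Dataset):
                        target = f'{src_path}/{name}'
                        parent = dest.require_group(target.rsplit('/', 1)[0])
                        parent.copy(obj, target.rsplit('/', 1)[1])
                src.visititems(copy_dataset)

# Build two tiny demo files, then merge them.
for i, path in enumerate(['part1.h5', 'part2.h5']):
    with h5py.File(path, 'w') as f:
        f.create_dataset(f'grp/data{i}', data=[i, i + 1])

merge_h5(['part1.h5', 'part2.h5'], 'merged.h5')
```

`Group.copy` copies across files and preserves dataset attributes, which a read-then-write loop would drop.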

How do I lazily concatenate “numpy ndarray”-like objects for sequential reading?

Submitted by 流过昼夜 on 2019-12-23 01:44:28
Question: I have a list of several large HDF5 files, each with a 4D dataset. I would like to obtain a concatenation of them on the first axis, i.e. an array-like object that would be used as if all datasets were concatenated. My final intent is to sequentially read chunks of the data along the same axis (e.g. `[0:100,:,:,:]`, `[100:200,:,:,:]`, ...), multiple times. Datasets in h5py share a significant part of the NumPy array API, which allows me to call `numpy.concatenate` to get the job done:

```python
files =
```
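Unlike `numpy.concatenate`, which reads everything into memory, a lazy wrapper can translate a global slice on the first axis into per-dataset reads. A sketch (class and method names are mine), duck-typed so plain NumPy arrays work in place of h5py datasets:

```python
import numpy as np

class LazyConcat:
    """Read-only view of several datasets concatenated along axis 0."""

    def __init__(self, datasets):
        self.datasets = datasets
        lengths = [d.shape[0] for d in datasets]
        # offsets[i] is where dataset i starts in the virtual array.
        self.offsets = np.concatenate([[0], np.cumsum(lengths)])

    @property
    def shape(self):
        return (int(self.offsets[-1]),) + self.datasets[0].shape[1:]

    def read(self, start, stop):
        # Gather the pieces of [start:stop) that fall in each dataset;
        # only those slices are actually read from disk.
        parts = []
        for d, off in zip(self.datasets, self.offsets):
            lo = max(start, off) - off
            hi = min(stop, off + d.shape[0]) - off
            if lo < hi:
                parts.append(d[lo:hi])
        return np.concatenate(parts, axis=0)

# Usage with in-memory stand-ins for h5py datasets:
a = np.arange(10).reshape(5, 2)
b = np.arange(10, 18).reshape(4, 2)
view = LazyConcat([a, b])
chunk = view.read(3, 7)  # spans the boundary between a and b
```

With real files, pass `[h5py.File(p, 'r')['data'] for p in paths]` and keep the files open for the lifetime of the view.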