hdf5

Object dtype dtype('O') has no native HDF5 equivalent

Submitted by 前提是你 on 2021-01-27 11:51:46
Question: Well, it seems like a couple of similar questions were asked here on Stack Overflow, but none of them seem to have been answered correctly or properly, nor do they describe exact examples. I have a problem with saving an array or list into hdf5 ... I have several files, each containing an array of shape (n, 35), where n may differ from file to file. Each of them can be saved to hdf5 with the code below. hdf = hf.create_dataset(fname, data=d) However, if I want to merge them into a 3D array, the error occurs as
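
One way around this (a minimal sketch, not taken from the original thread): arrays with different n cannot be stacked into a regular 3-D array, so NumPy falls back to dtype('O'), which h5py cannot write natively. Storing each array as its own dataset keeps everything in native float datasets; the file name and dataset names below are placeholders.

    import numpy as np
    import h5py

    # Placeholder data standing in for the per-file (n, 35) arrays.
    arrays = [np.random.rand(n, 35) for n in (10, 17, 25)]

    with h5py.File('merged.h5', 'w') as hf:      # hypothetical output file
        for i, d in enumerate(arrays):
            # One dataset per source array; no ragged 3-D stacking needed.
            hf.create_dataset('file_%d' % i, data=d)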

hdf5 file to pandas dataframe

Submitted by 荒凉一梦 on 2021-01-21 12:26:51
Question: I downloaded a dataset which is stored in .h5 files. I need to keep only certain columns and to be able to manipulate the data in them. To do this, I tried to load it into a pandas dataframe. I've tried to use: pd.read_hdf(path) But I get: No dataset in HDF5 file. I've found answers on SO (read HDF5 file to pandas DataFrame with conditions) but I don't need conditions, and the answer adds conditions about how the file was written, but I'm not the creator of the file so I can't do anything about
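
A possible workaround (a sketch, assuming the file was not written by pandas/PyTables, which is what usually triggers "No dataset in HDF5 file"): open it with h5py, inspect the layout, and build the DataFrame manually. The path, dataset name, and column selection below are placeholders.

    import h5py
    import pandas as pd

    path = 'downloaded.h5'                 # placeholder path
    with h5py.File(path, 'r') as f:
        print(list(f.keys()))              # see which groups/datasets the file actually contains
        data = f['some_dataset'][()]       # hypothetical dataset name; loads it into memory

    df = pd.DataFrame(data)
    df = df.iloc[:, [0, 2, 5]]             # keep only the columns you need (example indices)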

h5py randomly unable to open object (component not found)

Submitted by 生来就可爱ヽ(ⅴ<●) on 2021-01-07 02:55:48
Question: I'm trying to load hdf5 datasets in a PyTorch training loop. Regardless of num_workers in the dataloader, this randomly throws "KeyError: 'Unable to open object (component not found)'" (traceback below). I'm able to start the training loop, but can't get through 1/4 of one epoch without this error, which happens for random 'datasets' (each a 2D array). I'm able to separately load these arrays in the console using the regular f['group/subroup'][()] so it doesn't appear like the
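
One commonly suggested workaround (a sketch only, assuming the file handle is currently opened in __init__): open the HDF5 file lazily inside __getitem__, so each DataLoader worker gets its own handle instead of sharing one across forked processes. The path and key list are placeholders.

    import h5py
    from torch.utils.data import Dataset

    class H5Dataset(Dataset):
        def __init__(self, path, keys):
            self.path = path        # e.g. 'train.h5' (placeholder)
            self.keys = keys        # e.g. ['group/subroup/0', ...] (placeholder)
            self.file = None        # opened lazily, once per worker process

        def __len__(self):
            return len(self.keys)

        def __getitem__(self, idx):
            if self.file is None:
                self.file = h5py.File(self.path, 'r')
            return self.file[self.keys[idx]][()]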

pytables installation failed

Submitted by 社会主义新天地 on 2020-12-31 04:34:31
Question: I do: sudo pip install --upgrade tables I get: /usr/bin/ld: cannot find -lhdf5 collect2: ld returned 1 exit status .. ERROR:: Could not find a local HDF5 installation. You may need to explicitly state where your local HDF5 headers and library can be found by setting the ``HDF5_DIR`` environment variable or by using the ``--hdf5`` command-line option. Complete output from command python setup.py egg_info: /usr/bin/ld: cannot find -lhdf5 However: $ echo $HDF5_DIR /opt/hdf5/ $ ls /opt/hdf5/ bin
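
One thing worth checking (an assumption based on the output shown, not a confirmed diagnosis): sudo starts with a reset environment, so an HDF5_DIR exported in the user's shell is typically not visible to "sudo pip". Passing it on the same command line, assuming the library really lives under /opt/hdf5, sidesteps that:

    sudo HDF5_DIR=/opt/hdf5 pip install --upgrade tables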

parallel write to different groups with h5py

Submitted by 拟墨画扇 on 2020-12-15 06:18:41
Question: I'm trying to use parallel h5py to create an independent group for each process and fill each group with some data. What happens is that only one group gets created and filled with data. This is the program: from mpi4py import MPI import h5py rank = MPI.COMM_WORLD.Get_rank() f = h5py.File('parallel_test.hdf5', 'w', driver='mpio', comm=MPI.COMM_WORLD) data = range(1000) dset = f.create_dataset(str(rank), data=data) f.close() Any thoughts on what is going wrong here? Thanks a lot. Answer 1: Ok, so as
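
A sketch of one commonly cited explanation (not necessarily the truncated answer above, and assuming h5py was built with MPI support): in parallel HDF5, anything that changes file metadata, including create_dataset, is a collective operation, so every rank must create every dataset; each rank then writes only its own data.

    from mpi4py import MPI
    import h5py
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    f = h5py.File('parallel_test.hdf5', 'w', driver='mpio', comm=comm)
    # Dataset creation is collective: all ranks create all datasets.
    dsets = [f.create_dataset(str(r), (1000,), dtype='i8') for r in range(size)]
    # Each rank writes only to its own dataset.
    dsets[rank][:] = np.arange(1000)
    f.close()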

How to feed multiple NumPy arrays to a deep learning network in Keras?

Submitted by 只愿长相守 on 2020-12-06 06:36:35
Question: I have around 13 NumPy arrays stored as files that take around 24 gigabytes on disk. Each file is for a single subject and consists of two arrays: one containing input data (a list of 2D matrices, with rows representing sequential time), and the other containing the labels for the data. My final goal is to feed all the data to a deep learning network I've written in Keras to classify new data, but I don't know how to do it without running out of memory. I've read about Keras's data generators, but
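
One way to avoid loading all 24 GB at once (a sketch only; the file names, array layout, and the commented-out fit call are assumptions): wrap the per-subject files in a keras.utils.Sequence that loads one file per batch, and pass that to model.fit.

    import numpy as np
    from tensorflow import keras

    class SubjectSequence(keras.utils.Sequence):
        def __init__(self, data_files, label_files):
            self.data_files = data_files      # e.g. ['subj01_x.npy', ...] (placeholders)
            self.label_files = label_files    # e.g. ['subj01_y.npy', ...] (placeholders)

        def __len__(self):
            return len(self.data_files)       # one batch per subject file

        def __getitem__(self, idx):
            x = np.load(self.data_files[idx]) # only one subject's data in memory at a time
            y = np.load(self.label_files[idx])
            return x, y

    # model.fit(SubjectSequence(data_files, label_files), epochs=10)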

RuntimeError: Unable to create link (name already exists) when I append hdf5 file?

Submitted by 让人想犯罪 __ on 2020-11-29 03:30:45
Question: I am trying to append an hdf5 dataset to a previous hdf5 dataset, and the following error occurred: h5o.link(obj.id, self.id, name, lcpl=lcpl, lapl=self._lapl) File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper File "h5py/h5o.pyx", line 202, in h5py.h5o.link RuntimeError: Unable to create link (name already exists) sal_maps = np.array([], dtype=np.float32).reshape((0,) + img_size) probs = np.array([], dtype=np
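
A possible fix (a sketch, assuming the intent is to grow an existing dataset rather than recreate it): calling create_dataset with a name that already exists in the file raises exactly this error, so either delete the old dataset first or make it resizable with maxshape and resize it before writing the new rows. The file name, dataset name, and shapes below are placeholders.

    import numpy as np
    import h5py

    new_rows = np.random.rand(4, 224, 224).astype(np.float32)   # placeholder batch to append

    with h5py.File('results.h5', 'a') as f:                      # hypothetical file name
        if 'sal_maps' not in f:
            # First write: make the dataset growable along axis 0.
            f.create_dataset('sal_maps', data=new_rows,
                             maxshape=(None,) + new_rows.shape[1:], chunks=True)
        else:
            # Later writes: resize instead of re-creating (which raises the link error).
            dset = f['sal_maps']
            dset.resize(dset.shape[0] + new_rows.shape[0], axis=0)
            dset[-new_rows.shape[0]:] = new_rows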
