hdf

Opening a corrupted PyTables HDF5 file

Submitted by 寵の児 on 2019-12-11 02:37:27
Question: I am hoping for some help in opening a corrupted HDF5 file. I am accessing PyTables via pandas, but a pd.read_hdf() call produces the following error. I don't know anything about the inner workings of PyTables. I believe the file was corrupted because the process saving to it (appending every 10 seconds or so) got duplicated, so two identical processes were then appending to the same file. I am not sure why this would corrupt the file rather than duplicate the data, but the two errors occurred together…
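A first step that sometimes recovers data is to open the damaged file read-only and copy whatever nodes are still readable into a fresh file. This is only a hedged sketch with h5py, not a guaranteed repair; the file names and the `salvage` helper are illustrative, not from the question:

```python
import h5py

def salvage(src_path, dst_path):
    """Copy every top-level node that can still be read into a new file."""
    copied = []
    with h5py.File(src_path, "r") as src, h5py.File(dst_path, "w") as dst:
        for name in src:                 # iterate top-level groups/datasets
            try:
                src.copy(name, dst)      # recursive copy of the node
                copied.append(name)
            except Exception:
                pass                     # skip nodes the library cannot read
    return copied
```

If even opening read-only fails, the damage is in the file's metadata and tools like `h5dump` or HDF5's own recovery utilities are the next resort.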

Writing 2-D array int[n][m] to HDF5 file using Visual C++

Submitted by 柔情痞子 on 2019-12-11 00:54:16
Question: I'm just getting started with HDF5 and would appreciate some advice on the following. I have a 2-D array, data[][], passed into a method. The method looks like: void WriteData(int data[48][100], int sizes[48]). The size of the data is not actually 48 x 100 but rather 48 x sizes[i], i.e. each row can be a different length! In one simple case I'm dealing with, all rows are the same size (but not 100), so you could say the array is 48 x sizes[0]. How best to write this to HDF5? I have some…
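One way to store rows of different lengths is an HDF5 variable-length dataset. The question uses the C++ API, but the underlying HDF5 feature (vlen datatypes) is the same; this is a sketch with h5py, and the dataset name "rows" is illustrative:

```python
import numpy as np
import h5py

def write_ragged(path, data, sizes):
    """Store row i of `data` truncated to its real length sizes[i]."""
    vlen_int = h5py.vlen_dtype(np.dtype("int32"))
    with h5py.File(path, "w") as f:
        dset = f.create_dataset("rows", shape=(len(sizes),), dtype=vlen_int)
        for i, n in enumerate(sizes):
            # each element of the dataset is a 1-D array of its own length
            dset[i] = np.asarray(data[i][:n], dtype=np.int32)
```

For the simple case where all rows share the same length sizes[0], a plain fixed-shape dataset of shape (48, sizes[0]) is simpler and faster than vlen storage.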

Creating reference to HDF dataset in H5py using astype

Submitted by 夙愿已清 on 2019-12-10 20:38:18
Question: From the h5py docs, I see that I can cast an HDF dataset to another type using the dataset's astype method. This returns a context manager which performs the conversion on the fly. However, I would like to read in a dataset stored as uint16 and then cast it to float32. Thereafter, I would like to extract various slices from this dataset in a different function as the cast type float32. The docs explain the use as: with dataset.astype('float32'): castdata = dataset[:]. This would cause…
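The astype API has shifted across h5py versions: older releases used it as a context manager (as in the quoted docs), while h5py 3.x returns a view you slice directly. A sketch of the newer per-slice cast; the function name and dataset layout are illustrative:

```python
import h5py

def read_as_float32(path, name, sl):
    """Read a slice of a uint16 dataset, converted to float32 on the fly."""
    with h5py.File(path, "r") as f:
        # the cast happens during the read, so only the requested
        # slice is converted; nothing is cached between calls
        return f[name].astype("float32")[sl]
```

Because the conversion happens per read, passing the dataset around and slicing it in another function just means calling astype at each read site (or wrapping it as above).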

Images saved as HDF5 aren't colored

Submitted by 旧时模样 on 2019-12-10 17:15:28
Question: I'm currently working on a program that converts text files and JPEG images into the HDF5 format. Opened with HDFView 3.0, it seems that the images are saved only in greyscale. hdf = h5py.File("Sample.h5"); img = Image.open("Image.jpg"); data = np.asarray(img, dtype="uint8"); hdf.create_dataset("Photos/Image 1", data=data, dtype='uint8'); dset = hdf.get("Photos/Image 1"); dset.attrs['CLASS'] = 'IMAGE'; dset.attrs['IMAGE_VERSION'] = '1.2'; arr = np.asarray([0, 255], dtype=np.uint8); dset.attrs[…
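For HDFView to render an (H, W, 3) array in color, the HDF5 Image specification wants more attributes than CLASS and IMAGE_VERSION, and they are expected as fixed-length ASCII strings rather than Python str. A sketch of what that might look like (np.bytes_ produces the fixed-length strings; the function name is illustrative):

```python
import numpy as np
import h5py

def save_truecolor(path, name, rgb):
    """Store an RGB uint8 image with the attributes the Image spec expects."""
    # rgb: (height, width, 3) uint8 array in pixel-interlaced RGB order
    with h5py.File(path, "w") as f:
        dset = f.create_dataset(name, data=rgb, dtype="uint8")
        dset.attrs["CLASS"] = np.bytes_("IMAGE")
        dset.attrs["IMAGE_VERSION"] = np.bytes_("1.2")
        # these two are what typically flip HDFView from greyscale to color
        dset.attrs["IMAGE_SUBCLASS"] = np.bytes_("IMAGE_TRUECOLOR")
        dset.attrs["INTERLACE_MODE"] = np.bytes_("INTERLACE_PIXEL")
```

Without IMAGE_SUBCLASS, viewers may fall back to treating the data as an indexed or greyscale image even when the array has three channels.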

Read the properties of HDF file in Python

Submitted by 谁说胖子不能爱 on 2019-12-08 12:18:03
Question: I have a problem reading an HDF file in pandas. As of now, I don't know the keys of the file. How do I read the file [data.hdf] in such a case? Also, my file is .hdf, not .h5. Does that make a difference in terms of data fetching? I see that you need a 'group identifier in the store': pandas.io.pytables.read_hdf(path_or_buf, key, **kwargs). I was able to get the metadata from PyTables: File(filename=data.hdf, title='', mode='a', root_uep='/', filters=Filters(complevel=0, shuffle=False, fletcher32=False,…
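When the keys are unknown, HDFStore can list them before reading; the .hdf extension is just a name and does not change how pandas reads the file. A sketch, assuming the file was written by pandas/PyTables in the first place:

```python
import pandas as pd

def read_first_key(path):
    """List the group keys in the store, then read the first one."""
    with pd.HDFStore(path, mode="r") as store:
        keys = store.keys()              # e.g. ['/df']
    return keys, pd.read_hdf(path, key=keys[0])
```

If the file was produced by some other HDF5 tool rather than pandas, read_hdf may still fail on the discovered keys, and a generic library such as h5py is the better way to inspect it.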

hdf5 error when format=table, pandas pytables

Submitted by ▼魔方 西西 on 2019-12-08 09:05:09
Question: It seems that I get an error when format=table but no error with format=fixed. Here is the command. What's weird is that it still seems to load the data; I just have to figure out a way to move past this, and it would give me peace of mind not to have any error. The dataframe is preprocessed, with types set within the columns. The command I run is: hdf = pd.HDFStore('path-to-file'); hdf.put('df', df, format='table'). The error I get is: HDF5ExtError: HDF5 error back trace File "../../../src/H5Dio.c",…
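A common trigger for HDF5ExtError with format='table' (but not with 'fixed') is object-dtype columns holding mixed types, or strings longer than the width PyTables reserved for them. The truncated trace doesn't confirm the cause here, so this is only a hedged sketch of the usual workarounds; the key 'df' and the 64-character width are illustrative:

```python
import pandas as pd

def put_as_table(path, df):
    """Write with format='table' after normalizing object columns."""
    df = df.copy()
    # force object columns to plain strings so PyTables can serialize them
    for col in df.select_dtypes(include="object").columns:
        df[col] = df[col].astype(str)
    with pd.HDFStore(path) as store:
        # reserve room for strings up to 64 chars in all string columns
        store.put("df", df, format="table", min_itemsize={"values": 64})
```

The table format is pickier than fixed because it builds a queryable, appendable on-disk schema, so column types and string widths have to be pinned down at write time.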

Using std::string in HDF5 creates unreadable output

Submitted by 亡梦爱人 on 2019-12-07 12:56:09
Question: I'm currently using HDF5 1.8.15 on Windows 7 64-bit. The source code of my software is saved in files using UTF-8 encoding. As soon as I call any HDF5 function that accepts std::string, the output gets cryptic, but if I use const char* instead of std::string, everything works fine. This also applies to the filename. Here is a short sample: std::string filename_ = "test.h5"; H5::H5File file(filename_.c_str(), H5F_ACC_TRUNC); // works H5::H5File file(filename_, H5F_ACC_TRUNC); // filename is not readable, or hdf5 throws an exception. I guess that this problem is caused by different encodings used…

Is it possible to read the .Rdata file format from C or Fortran?

Submitted by 匆匆过客 on 2019-12-04 02:13:08
Question: I'm working on writing some R extensions in C (C functions to be called from R). My code needs to compute a statistic using two different datasets at the same time, and I need to do this for all possible pair combinations. Then I need all these statistics (very large arrays) to continue the calculation on the C side. Those files are very large, typically ~40 GB, and that's my problem. To do this in C called from R, first I need to load all the datasets into R and then pass them to the C function…

pd.read_hdf throws 'cannot set WRITABLE flag to True of this array'

Submitted by 南楼画角 on 2019-12-03 19:48:13
Question: When running pd.read_hdf('myfile.h5') I get the following traceback error: [[...some longer traceback]] ~/.local/lib/python3.6/site-packages/pandas/io/pytables.py in read_array(self, key, start, stop) 2487 2488 if isinstance(node, tables.VLArray): -> 2489 ret = node[0][start:stop] 2490 else: 2491 dtype = getattr(attrs, 'value_type', None) ~/.local/lib/python3.6/site-packages/tables/vlarray.py in __getitem__(self, key) ~/.local/lib/python3.6/site-packages/tables/vlarray.py in read(self, start,…