Questions tagged [h5py]

h5py is a NumPy-compatible Python module for reading and writing Hierarchical Data Format (HDF5) files.

Main features

  • Free (BSD licensed)
  • Limited dependencies (Python, NumPy, the HDF5 libraries)
  • Includes both a low-level, C-like HDF5 interface and a high-level Python/NumPy-style interface
  • Datasets can be manipulated directly using NumPy idioms such as slicing
  • Datatypes are specified using standard NumPy dtype objects

1301 questions
0 votes, 1 answer

h5py IOError: unable to open file

For some strange reason, h5py is unable to find an input file. It consistently throws this error unless the input file is in the same directory as the module that's attempting to open the file. This is strange because it used to work fine a while…
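Errors like this are usually a relative-path problem: h5py resolves paths against the current working directory, not the directory of the module doing the opening. A minimal sketch of resolving to an absolute path first (the file name "input.h5" and dataset name "x" are hypothetical stand-ins):

```python
import os
import tempfile

import h5py
import numpy as np

# Create a small file so the example is self-contained.
base = tempfile.mkdtemp()
path = os.path.join(base, "input.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("x", data=np.arange(5))

# Build an absolute path instead of relying on the working directory,
# and fail with a clear message if the file genuinely isn't there.
abs_path = os.path.abspath(path)
assert os.path.exists(abs_path), "No such file: " + abs_path

with h5py.File(abs_path, "r") as f:
    data = f["x"][()]
```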
0 votes, 1 answer

How to read contents of datasets of a h5py file into a numpy array given a list of keys?

Inputs to my function are an h5py file and a text file. The text file has two columns: the first has some utterance information and the second has the speaker information (for that utterance). The keys of the h5py file (created using create_dataset)…
user1540393 • 69 • 1 • 8
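For the question above, one straightforward approach is to read each dataset with f[key][()] and stack the rows into one array. A minimal sketch with made-up keys and shapes:

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical keys (utterance IDs) and a throwaway file.
path = os.path.join(tempfile.mkdtemp(), "utterances.h5")
keys = ["utt1", "utt2", "utt3"]

with h5py.File(path, "w") as f:
    for i, k in enumerate(keys):
        f.create_dataset(k, data=np.full(4, i, dtype=np.float32))

with h5py.File(path, "r") as f:
    # f[key][()] reads the whole dataset into a NumPy array.
    rows = [f[k][()] for k in keys]

stacked = np.stack(rows)  # one row per key
```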
0 votes, 1 answer

H5py IOError: Can't write data

For a deep learning project, I reorganize data from hdf5 files into a big training_data.hdf5 containing batch0...batch_max. When running it locally on my laptop it works, but the process is a bit long (a lot of data), so I tried to run it on a big-CPU…
Gericault • 219 • 1 • 3 • 8
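Consolidating many per-batch files into one training file can be done without loading each dataset fully into RAM by using h5py's Group.copy. A sketch under assumed names (the "part" source files and "batch" dataset are hypothetical):

```python
import os
import tempfile

import h5py
import numpy as np

# Fabricate a few small source files standing in for the batch files.
tmp = tempfile.mkdtemp()
src_paths = [os.path.join(tmp, "part%d.h5" % i) for i in range(3)]
for i, p in enumerate(src_paths):
    with h5py.File(p, "w") as f:
        f.create_dataset("batch", data=np.full((2, 2), i))

# Copy each source dataset into one output file under a new name.
out_path = os.path.join(tmp, "training_data.hdf5")
with h5py.File(out_path, "w") as out:
    for i, p in enumerate(src_paths):
        with h5py.File(p, "r") as src:
            # Group.copy streams the dataset file-to-file.
            src.copy("batch", out, name="batch%d" % i)

with h5py.File(out_path, "r") as out:
    n_batches = len(out.keys())
```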
0 votes, 0 answers

VGG16 weights file for keras 2 cannot be loaded

I am doing VGG16 fine-tuning using Keras 2. Before actually refining the weights of the top layers, I need to load the VGG16 model weights to avoid long training times. I downloaded the VGG16 weights from this link. The weights I tried included…
0 votes, 1 answer

h5py open file blocks with MPI

I'm trying to open an HDF5 file using h5py with MPI by executing print("Opening...") f = h5py.File(file_path, "r", driver='mpio', comm=MPI.COMM_WORLD) print("Done") For some reason, this line blocks when executed in my project. I tried to create a…
Fabian N. • 3,807 • 2 • 23 • 46
0 votes, 0 answers

Save and load big array (> 100 gb) python, and/or reduce size

I need to save a really big array (a matrix of doubles with size 5e5 x 3e4). The context: I have a 1D simulation of a viscous disc, and each row is a snapshot of the simulation (the surface density). All the data is relevant (more or less), so…
Ardemion • 33 • 1 • 7
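For arrays far larger than RAM, a common pattern is a chunked, compressed dataset written one snapshot (row) at a time, so the full matrix never has to exist in memory. A sketch with tiny stand-in shapes for the 5e5 x 3e4 matrix (the dataset name "surface_density" is hypothetical):

```python
import os
import tempfile

import h5py
import numpy as np

n_rows, n_cols = 8, 16  # tiny stand-ins for 5e5 x 3e4
path = os.path.join(tempfile.mkdtemp(), "disc.h5")

with h5py.File(path, "w") as f:
    dset = f.create_dataset(
        "surface_density",
        shape=(n_rows, n_cols),
        dtype="f8",
        chunks=(1, n_cols),   # one snapshot per chunk
        compression="gzip",   # trades CPU for smaller files
    )
    for i in range(n_rows):
        # Write one snapshot at a time; only this row is in memory.
        dset[i] = np.random.rand(n_cols)

with h5py.File(path, "r") as f:
    shape = f["surface_density"].shape
```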
0 votes, 1 answer

h5f.create_dataset causes MemoryError

I am currently trying to store a big numpy.ndarray using h5py. print len(train_input_data_interweawed_normalized) print train_input_data_interweawed_normalized[0].shape raw_input("Something") print "Storing Train input" h5f =…
Fixining_ranges • 223 • 1 • 13
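MemoryErrors here usually come from concatenating all samples into one giant ndarray before calling create_dataset. An alternative is a resizable dataset that samples are appended to one at a time. A sketch with hypothetical names and shapes:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "train.h5")
sample_shape = (4,)  # hypothetical per-sample shape

with h5py.File(path, "w") as f:
    dset = f.create_dataset(
        "train_input",
        shape=(0,) + sample_shape,
        maxshape=(None,) + sample_shape,  # unlimited along axis 0
        dtype="f4",
    )
    for i in range(5):  # pretend samples arrive one at a time
        sample = np.full(sample_shape, i, dtype="f4")
        dset.resize(dset.shape[0] + 1, axis=0)  # grow by one row
        dset[-1] = sample                       # write only this sample

with h5py.File(path, "r") as f:
    n_samples = f["train_input"].shape[0]
    last = f["train_input"][-1]
```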
0 votes, 2 answers

Optimising HDF5 dataset for Read/Write speed

I'm currently running an experiment where I scan a target spatially and grab an oscilloscope trace at each discrete pixel. Generally my trace lengths are 200Kpts. After scanning the entire target I assemble these time domain signals spatially and…
Alex Taylor • 3 • 1 • 2
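For read/write speed, the chunk shape should match the dominant access pattern: HDF5 reads and writes whole chunks, so a chunk spanning one full trace makes per-pixel writes cheap. A sketch with stand-in sizes for the scan grid and the ~200k-point traces ("traces" is a hypothetical dataset name):

```python
import os
import tempfile

import h5py
import numpy as np

n_pixels, n_samples = 16, 256  # stand-ins for the real scan sizes
path = os.path.join(tempfile.mkdtemp(), "scan.h5")

with h5py.File(path, "w") as f:
    dset = f.create_dataset(
        "traces",
        shape=(n_pixels, n_samples),
        dtype="f8",
        # One chunk per trace: each per-pixel write touches one chunk.
        chunks=(1, n_samples),
    )
    for p in range(n_pixels):
        dset[p] = np.random.rand(n_samples)

with h5py.File(path, "r") as f:
    chunk_shape = f["traces"].chunks
```

If the later analysis reads across pixels at a fixed time sample instead, a chunk shape like (n_pixels, k) would suit reads better; benchmarking both against the real access pattern is the only reliable guide.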
0 votes, 1 answer

how to create variable length ascii encoded string with h5py

I want to write a variable-length string using Python with h5py. If I use dset = grp.create_dataset('data_set_name',{1},dtype=h5py.special_dtype(vlen=str)) dset[0] = 'some_string' then h5dump tells me DATASET "data_set_name" { DATATYPE …
Walter • 44,150 • 20 • 113 • 196
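A sketch of variable-length ASCII strings using the modern API: h5py.string_dtype() (h5py >= 2.10) replaces the older special_dtype(vlen=str), and takes an encoding argument. Note that h5py 3.x returns bytes when reading string data unless .asstr() is used.

```python
import os
import tempfile

import h5py

path = os.path.join(tempfile.mkdtemp(), "strings.h5")

with h5py.File(path, "w") as f:
    # Variable-length ASCII string datatype.
    dt = h5py.string_dtype(encoding="ascii")
    dset = f.create_dataset("data_set_name", shape=(1,), dtype=dt)
    dset[0] = "some_string"

with h5py.File(path, "r") as f:
    # .asstr() decodes the stored bytes back to Python str.
    value = f["data_set_name"].asstr()[0]
```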
0 votes, 0 answers

Keras/Tensorflow/h5py - KeyError: "Can't open attribute (Can't locate attribute: 'nb_layers')"

I have the following code snippet: def save_bottleneck_features(): """builds the pretrained vgg16 model and runs it on our training and validation datasets""" datagen = ImageDataGenerator(rescale=1./255) # match the vgg16 architecture…
Simplicity • 47,404 • 98 • 256 • 385
0 votes, 1 answer

Logical way to create a custom h5py build with conda

I am having a design issue with some packages in conda. I have done the following steps to get where I am: build a custom version of hdf5 (enabling certain compiler flags); build a custom version of h5py (with modifications calling different APIs from…
user2886057 • 646 • 1 • 5 • 15
0 votes, 2 answers

Fastest way to go through a long list of arrays

I have a data set from electrophysiological recordings in an hdf5 file, in the form of what is (from my understanding) really close to numpy arrays, and I am trying to access it in the most efficient and fast way. Let me explain: the dataset…
ukey • 13 • 2
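For iterating over many arrays stored in one HDF5 file, the usual speedup is to avoid indexing the h5py Dataset element-by-element (each access is a disk read) and instead pull the dataset into memory once with [()] or [:]. A sketch with a made-up "sweeps" dataset:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "recordings.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("sweeps", data=np.arange(12.0).reshape(3, 4))

with h5py.File(path, "r") as f:
    # Single bulk read into a NumPy array; no further disk access needed.
    sweeps = f["sweeps"][()]

# Everything below operates purely in memory.
totals = [row.sum() for row in sweeps]
```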
0 votes, 2 answers

how to import h5py on datalab?

Does anybody know how to install h5py on datalab? pip install h5py doesn't work. apt-get install python-h5py works in the shell, but the package isn't recognized in a datalab notebook! Thanks
Nima • 55 • 8
0 votes, 2 answers

HDF file not saving properly with h5py

I'm trying to save a numpy array in an HDF file with h5py as follows: with h5py.File("mfcc_aligned.hdf", "w") as aligned_f: # do stuff to create two numpy arrays, training_X and training_Y print(len(training_X)) # this returns the number of…
Jess • 1,515 • 3 • 23 • 32
0 votes, 2 answers

hdf to ndarray in numpy - fast way

I am looking for a fast way to collect my HDF files into a numpy array where each row is a flattened version of an image. What I mean exactly: my HDF files store, besides other information, images per frame. Each file holds 51 frames with…
mrks • 141 • 2 • 15
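A sketch of the image-to-row conversion described above: read the frame stack from each file, flatten every frame with reshape, and vstack the per-file blocks. The file layout and "frames" dataset name are stand-ins for the real 51-frame files:

```python
import os
import tempfile

import h5py
import numpy as np

# Fabricate two small files, each holding a (n_frames, H, W) image stack.
tmp = tempfile.mkdtemp()
paths = []
for i in range(2):
    p = os.path.join(tmp, "clip%d.h5" % i)
    with h5py.File(p, "w") as f:
        f.create_dataset("frames", data=np.full((3, 4, 4), i, dtype="u1"))
    paths.append(p)

rows = []
for p in paths:
    with h5py.File(p, "r") as f:
        frames = f["frames"][()]                      # (n_frames, H, W)
        rows.append(frames.reshape(len(frames), -1))  # flatten each frame

matrix = np.vstack(rows)  # one flattened image per row
```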