
I'm trying to store the results of a fit I made using the lmfit package for python in an hdf5 file using the h5py package for python.

Currently I find myself recreating the structure of the data object by hand (i.e. loop over all keys in dictionary, get values and save them).

I have the feeling there has to be a more efficient/pythonic way of saving such an object in an hdf5 file, similar to how pickling an object would work.

Could anyone help me find a way to efficiently store the information contained in an lmfit.model.ModelFit or lmfit.parameter.Parameters object in an hdf5 file?

Edited to show the code currently used:

def add_analysis_datagroup_to_file(self, group_name='Analysis'):
    # Create the analysis group, or reuse it if it already exists
    try:
        self.analysis_group = self.f.create_group(group_name)
    except ValueError:
        print 'Datagroup name "%s" already exists in hdf5 file' % group_name
        self.analysis_group = self.f[group_name]

def save_fitted_parameters(self, fit_results=None):
    if fit_results is None:
        fit_results = self.fit_results
    try:
        fit_grp = self.analysis_group.create_group('Fitted Params')
    except ValueError:
        fit_grp = self.analysis_group['Fitted Params']
    # One subgroup per fit parameter, with its fields stored as hdf5 attributes
    for parname, par in fit_results.params.iteritems():
        try:
            par_group = fit_grp.create_group(parname)
        except ValueError:
            par_group = fit_grp[parname]
        par_dict = vars(par)
        for val_name, val in par_dict.iteritems():
            if val_name == '_val':
                val_name = 'value'
            if val_name == 'correl' and val is not None:
                # Correlations are a dict, so they get their own subgroup
                try:
                    correl_group = par_group.create_group(val_name)
                except ValueError:
                    correl_group = par_group[val_name]
                for cor_name, cor_val in val.iteritems():
                    correl_group.attrs.create(name=cor_name, data=cor_val)
            else:
                try:
                    par_group.attrs.create(name=val_name, data=val)
                except TypeError:
                    # skip values h5py cannot store as attributes (e.g. None)
                    pass
Adriaan Rol
  • Can you show the current code that you're using? –  Jan 06 '15 at 10:41
  • The likely answer though, is 'no': you will have to specify the individual groups and subgroups in the HDF5 file separately. There may be a smart way to code this, but I don't believe there's an interface that automatically loops and recurses through a dict and creates an HDF5 file with groups and subgroups (see the sketch after these comments). –  Jan 06 '15 at 10:43
  • @Evert, this is the code I'm currently using. It's up in the original post because it is too big for the comment window. – Adriaan Rol Jan 06 '15 at 12:49
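
As the second comment notes, h5py does not walk a nested dictionary for you, but a small recursive helper can. The sketch below is only an illustration of that idea: save_dict_to_hdf5 is a made-up name, and it assumes the dictionary contains nothing but further dicts plus values (scalars, strings, small arrays) that h5py can store as attributes.

    import h5py

    def save_dict_to_hdf5(dic, h5group):
        """Recursively write a nested dict into an open h5py group."""
        for key, value in dic.items():
            if isinstance(value, dict):
                # nested dict -> create a subgroup and recurse into it
                subgroup = h5group.create_group(str(key))
                save_dict_to_hdf5(value, subgroup)
            elif value is None:
                # h5py cannot store None, keep a string marker instead
                h5group.attrs[str(key)] = 'None'
            else:
                # scalars, strings and small arrays become attributes
                h5group.attrs[str(key)] = value

    # usage:
    # with h5py.File('out.hdf5', 'w') as f:
    #     save_dict_to_hdf5(vars(par), f.create_group(parname))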

1 Answer


This is quite an old post, but I just had the same problem, so hopefully this answer will help someone... You can use the built-in dumps() method of the ModelResult class in lmfit to convert the result into a JSON string, which can then be saved in hdf5 as a variable-length string. You can also use a list comprehension to build an array of JSON strings if you want to store multiple fit results in one dataset. The Parameters class as well as the Model class also have dumps() methods. To reload, use loads() (again, Parameters and Model objects can be reloaded as they have loads() methods as well).

    import h5py
    import numpy as np

    f = h5py.File('fit_result_example.hdf5', 'w')
    grp = f.create_group('group1')
    # variable-length string dtype, so the JSON dump can have any length
    dt = h5py.special_dtype(vlen=str)
    json_dump = np.asarray([fit_results.dumps()], dtype=dt)
    grp.create_dataset('fit_results', data=json_dump)
    f.close()
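
To get the fit back out, read the string from the dataset and feed it to loads(). The snippet below is a rough sketch of that round trip; building an empty ModelResult to load into mirrors the recipe lmfit's own load_modelresult() helper uses, but check it against your lmfit/h5py versions.

    import h5py
    from lmfit import Parameters
    from lmfit.model import Model, ModelResult

    f = h5py.File('fit_result_example.hdf5', 'r')
    json_str = f['group1/fit_results'][0]   # the JSON dump stored above
    f.close()
    if isinstance(json_str, bytes):
        # newer h5py versions hand variable-length strings back as bytes
        json_str = json_str.decode('utf-8')

    # loads() is an instance method, so make an empty ModelResult to load into
    restored = ModelResult(Model(lambda x: x, None), Parameters())
    restored.loads(json_str)
    print(restored.fit_report())

    # a Parameters object saved with Parameters.dumps() can be restored the
    # same way: params = Parameters(); params.loads(params_json)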
derk