
How can I access the 'optimized' thetas from the GPR (Kriging-type) in scikit-learn? I want to get the theta for each variable/parameter, to verify the influence of those variables/parameters on the model's output.

I have tried the following code, but what I found (for a simple RBF kernel) is not what I seek. By theta I mean the "width" scale factor (length scale) of the Gaussian bell curve (RBF).

print('model kernel: ', model.kernel_)
model kernel:  0.188**2 * RBF(length_scale=3.19)

print('model parameters: ', model.get_params())
model parameters:  {'alpha': 1e-10, 'copy_X_train': True, 'kernel__k1': 1**2, 'kernel__k2': RBF(length_scale=1), 'kernel__k1__constant_value': 1, 'kernel__k1__constant_value_bounds': (1e-05, 100000.0), 'kernel__k2__length_scale': 1.0, 'kernel__k2__length_scale_bounds': (0.001, 1000.0), 'kernel': 1**2 * RBF(length_scale=1), 'n_restarts_optimizer': 20, 'normalize_y': False, 'optimizer': 'fmin_l_bfgs_b', 'random_state': None}

print('kernel parameters: ', model.kernel_.get_params())
kernel parameters:  {'k1': 0.188**2, 'k2': RBF(length_scale=3.19), 'k1__constant_value': 0.0354057439588758, 'k1__constant_value_bounds': (1e-05, 100000.0), 'k2__length_scale': 3.1913782768411574, 'k2__length_scale_bounds': (0.001, 1000.0)}
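A minimal sketch of how the fitted hyperparameters can be read back (the data and the anisotropic kernel here are hypothetical, assuming a recent scikit-learn): after fitting, the optimizer's result lives in `model.kernel_`, whose `theta` property holds the log-transformed hyperparameters.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical toy data: 30 samples, 2 input variables
rng = np.random.RandomState(0)
X = rng.uniform(0, 5, (30, 2))
y = np.sin(X[:, 0]) + np.cos(X[:, 1])

# Anisotropic (ARD) RBF: one length scale per input variable
kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 1.0])
model = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=5).fit(X, y)

# model.kernel_ is the fitted clone; model.kernel keeps its initial values
print(model.kernel_.theta)            # log-transformed fitted hyperparameters
print(np.exp(model.kernel_.theta))    # same values back on the original scale
print(model.kernel_.k2.length_scale)  # per-variable length scales of the RBF part
```

With a per-dimension (ARD) length scale like this, a larger fitted length scale for a variable means the output is less sensitive to it, which is one way to gauge each variable's influence.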
  • Kernels have a property `theta`, is that what you need? `get_params` only shows initialization parameters, not fitted attributes. – Ben Reiniger May 11 '22 at 00:44
  • @BenReiniger - thank you, I missed that. Nonetheless, my thetas are always zero... is there any specific reason for that? Using a simple isotropic RBF kernel, `kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-3, 1e3))`, I request the following data: `print('model kernel: ', model.kernel_, ' obtained after ', str(round(t2-t1,1)), ' seconds of fitting')` and `print('kernel thetas (log transformed): ', str(kernel.theta))`, and get: `model kernel: 0.777**2 * RBF(length_scale=0.834) + 0.034**2 * 0.034**2 obtained after 545.4 seconds of fitting` and `kernel thetas (log transformed): [0. 0. 0. 0.]` – prz Jun 07 '22 at 13:42
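A likely explanation for the all-zero thetas in the comment above: `kernel.theta` reads the *initial* kernel object passed to the regressor, not the fitted clone. Since the initial constant value and length scale are both 1, their logs are all 0. A minimal sketch (with hypothetical 1-D toy data) contrasting the two:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical toy data
X = np.linspace(0, 5, 20).reshape(-1, 1)
y = np.sin(X).ravel()

kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-3, 1e3))
model = GaussianProcessRegressor(kernel=kernel).fit(X, y)

print(kernel.theta)         # initial kernel: log(1) = 0 for every parameter
print(model.kernel_.theta)  # fitted clone: the optimized log-hyperparameters
```

Fitting leaves the original `kernel` object untouched; only `model.kernel_` carries the optimized values.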

0 Answers