
I am using an RBF 2D interpolator for a point cloud Z = f(X, Y) taken from a section of terrain containing a mountain.

When I interpolate over a grid, the edges of the grid get "lifted", because the default value (returned where there are no nearby points) seems to be the AVERAGE of the Z coordinates. As a result, the image I get is not that of a mountain, but of a mountain "sunk" into, and imprinted on, a plane whose height is well above zero.

Below, a cross-section of my point cloud is shown in green, and a continuous slice of interpolated points is shown in pink.

[image: cross-section of the point cloud (green) and the interpolated slice (pink)]

The desired behavior would be to get Z = 0 where there are no nearby points.

Is there a way I can tell the interpolator that I want zero as the default value, even if the point cloud does not intersect the zero plane?

My current code is like this (C#):

        // Build a 2-D scalar RBF model: 2 input dimensions (X, Y), 1 output (Z)
        alglib.rbfcreate(2, 1, out _model);

        // point_cloud is an N x 3 array with one (X, Y, Z) sample per row
        alglib.rbfsetpoints(_model, point_cloud);

        // Multilayer algorithm: base radius and number of layers
        alglib.rbfsetalgomultilayer(_model, radius, layers);

        alglib.rbfreport report;
        alglib.rbfbuildmodel(_model, out report);
heltonbiker

1 Answer


Their documentation says "at 0-th (optional) iteration algorithm builds linear least squares model. Values predicted by linear model are subtracted from function values at nodes, and residual vector is passed to the next iteration". I think this is the problem, and I hope one can indeed opt out of the 0-th iteration without customizing their code.
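If that linear term is indeed the cause, a possible way to opt out is to ask for a zero polynomial term before building the model. This is only a sketch based on my reading of the ALGLIB reference, not something I have tested: it assumes your ALGLIB build exposes rbfsetzeroterm(), so check it against your version:

    // Sketch, not tested: rbfsetzeroterm() is assumed to be available in
    // your ALGLIB build. It replaces the default linear polynomial term
    // with a zero term, so the model is a pure sum of basis functions and
    // decays towards 0 away from the data instead of towards a fitted plane.
    alglib.rbfmodel model;
    alglib.rbfcreate(2, 1, out model);
    alglib.rbfsetpoints(model, point_cloud);              // N x 3: X, Y, Z
    alglib.rbfsetzeroterm(model);                         // opt out of the linear term
    alglib.rbfsetalgomultilayer(model, radius, layers);
    alglib.rbfreport report;
    alglib.rbfbuildmodel(model, out report);

With the zero term the far-field value should tend towards 0 rather than towards the least-squares plane. Note that rbfsetconstterm(), if that is all your version offers, would still give a flat non-zero plateau, so the zero term is the one that matches the desired behavior here.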

rych
  • Wow, I've read this page countless times, but didn't notice that. Gonna test it and post some feedback soon. By the way, even their forums are not that active... :( Thanks for now! – heltonbiker Oct 09 '14 at 12:40