I am using a 2D RBF interpolator on a point cloud Z = f(X,Y) covering a section of terrain that contains a mountain.
When I interpolate over a grid, the edges of the grid get "lifted": the default value returned where there are no nearby points seems to be the AVERAGE of the Z coordinates. So instead of a mountain, the image I get is that of a mountain "sunk" into, and imprinted on, a plane whose height is well above zero.
Below, a cross-section of my point cloud is shown in green, and a continuous slice of interpolated points is shown in pink.
The desired behavior would be to get Z = 0 wherever there are no nearby points.
Is there a way to tell the interpolator that I want zero as the default value, even though the point cloud does not intersect the zero plane?
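For reference, probing the model (built as shown below) at a single location far outside the data illustrates the behavior; xFar and yFar here are just placeholder coordinates well away from any data point:

double xFar = 10000, yFar = 10000;                 // placeholder coordinates far outside the point cloud
double zFar = alglib.rbfcalc2(_model, xFar, yFar); // comes back near the average Z, not 0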
My current code is like this (C#):
alglib.rbfcreate(2, 1, out _model);                  // 2 input dimensions (X, Y), 1 output (Z)
alglib.rbfsetpoints(_model, point_cloud);            // point_cloud holds one (X, Y, Z) row per point
alglib.rbfsetalgomultilayer(_model, radius, layers); // multilayer algorithm: base radius and number of layers
alglib.rbfreport report;
alglib.rbfbuildmodel(_model, out report);
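And this is roughly how I sample the model over the output grid (the grid node values below are placeholders; the real grid covers the terrain's bounding box with some margin):

double[] gridX = { 0, 10, 20, 30, 40 };              // placeholder grid nodes along X
double[] gridY = { 0, 10, 20, 30, 40 };              // placeholder grid nodes along Y
double[,] gridZ = new double[gridX.Length, gridY.Length];
for (int i = 0; i < gridX.Length; i++)
    for (int j = 0; j < gridY.Length; j++)
        gridZ[i, j] = alglib.rbfcalc2(_model, gridX[i], gridY[j]);
// Cells far from any input point come out near the average Z, which produces the "lifted" edges.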