I have tabular data with columns x, a, b, c, ...; x has a finite range. I would like to find/learn a function f(a, b, c, ...) such that, if we plot all points (x, y = f(a, b, c, ...)), the line fitted to those points has a positive slope. That is, as x increases, f(a, b, c, ...) also increases. What is the best and simplest way to find such an f? More precisely, for any two rows (x1, a1, b1, c1, ...) and (x2, a2, b2, c2, ...) in the table, if x1 < x2 then f(a1, b1, c1, ...) < f(a2, b2, c2, ...).
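To make the condition concrete, here is a minimal sketch of the pairwise property I want f to satisfy, assuming a pandas DataFrame with hypothetical columns x, a, b, c (the data and function names are just for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical table: x plus feature columns a, b, c.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "x": rng.uniform(0, 10, size=50),
    "a": rng.normal(size=50),
    "b": rng.normal(size=50),
    "c": rng.normal(size=50),
})

def satisfies_ordering(x, scores):
    """True iff for every pair of rows, x_i < x_j implies scores_i < scores_j."""
    x = np.asarray(x)
    scores = np.asarray(scores)
    mask = x[:, None] < x[None, :]             # mask[i, j]: x_i < x_j
    diffs = scores[None, :] - scores[:, None]  # diffs[i, j]: scores_j - scores_i
    return bool(np.all(diffs[mask] > 0))       # every such pair must strictly increase

# Any strictly increasing transform of x itself trivially satisfies the property;
# an unrelated column almost surely does not.
print(satisfies_ordering(df["x"], 2 * df["x"] + 1))  # True
print(satisfies_ordering(df["x"], df["a"]))          # False (with high probability)
```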
One can see this as a "relaxed" version of predicting x in supervised learning.
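For example, one way to make that framing concrete (a sketch only, assuming scikit-learn and SciPy, with toy data and hypothetical column names) is to fit an ordinary regressor to predict x from the features, but judge it only by how well its output preserves the ordering of x, e.g. via Spearman rank correlation, rather than by prediction error:

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestRegressor

# Toy data with an assumed relation between x and the features.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame(rng.normal(size=(n, 3)), columns=["a", "b", "c"])
df["x"] = df["a"] + 0.5 * df["b"] ** 2 + 0.1 * rng.normal(size=n)

# Fit a regressor for x, then treat its prediction as the candidate f(a, b, c).
reg = RandomForestRegressor(random_state=0).fit(df[["a", "b", "c"]], df["x"])
scores = reg.predict(df[["a", "b", "c"]])

# Evaluate only the ordering: 1.0 means the ranking of f matches the ranking of x.
rho, _ = spearmanr(df["x"], scores)
print(f"Spearman rank correlation between x and f(a, b, c): {rho:.3f}")
```

I am wondering whether there is a simpler or more principled way to learn f that targets only this ordering property directly.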