I have a JavaScript slider that outputs a value between 0 and 1 depending on its position. I want to map that value onto another scale, say 100 to 1000, based on the distribution of a set of data points between 100 and 1000.
The use case is that I want the slider to be less sensitive to changes where the data points are closely clustered. For example, say the values in the scale are:
100, 200, 300, 500, 1000
The values 100-500 might take up, say, the first 80% of the slider because they are more closely spaced, making it easier to select between them.
There's presumably a mathematical function for calculating this, perhaps involving the standard deviation and some coefficients. Does anyone know what it is?
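For what it's worth, here's a rough sketch of the behaviour I'm imagining, assuming a simple piecewise-linear mapping where each pair of adjacent data points gets an equal share of the slider (I don't know if this is the "proper" approach):

```javascript
// Map a slider position t in [0, 1] onto a sorted array of data points,
// giving every adjacent pair of points an equal share of the slider and
// interpolating linearly within each segment. Closely spaced values then
// occupy more slider travel relative to their numeric distance.
function sliderToScale(t, points) {
  const segments = points.length - 1;
  const scaled = Math.min(Math.max(t, 0), 1) * segments; // clamp t, find its segment
  const i = Math.min(Math.floor(scaled), segments - 1);  // segment index
  const frac = scaled - i;                                // position within that segment
  return points[i] + frac * (points[i + 1] - points[i]);
}

const points = [100, 200, 300, 500, 1000];
console.log(sliderToScale(0.5, points));  // 300 (halfway lands in the dense region)
console.log(sliderToScale(0.75, points)); // 500
console.log(sliderToScale(1, points));    // 1000
```

With this mapping the first three quarters of the slider cover 100-500 and only the last quarter covers 500-1000, which is roughly the 80/20 split described above.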