I want a function that returns a random number between two given values. The catch is that I want it to always "prefer" the lower values over the higher ones, tapering off along a sort of "curve".
So, say I give it 100 and 1000: it could return any number between those two values, but it should return values between 100 and 200 far more often than the ~11.11% a uniform distribution would give. It might return those values around 30-40% of the time, whilst the uppermost values might only come up 2-4% of the time. Any ideas on how best to tackle this?
The language is C#, but that probably doesn't matter all that much.
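
For context, here's a minimal sketch of one idea I've seen suggested: skew a uniform sample with `Math.Pow` before scaling it into the range. The `NextBiased` name and the `exponent` parameter are just placeholders I made up, not anything I have working.

```csharp
using System;

class BiasedRandomDemo
{
    static readonly Random Rng = new Random();

    // Power-curve idea: raise a uniform [0,1) sample to an exponent > 1,
    // which pushes results toward 0, then scale into [min, max).
    // exponent = 1.0 is plain uniform; larger exponents bias harder toward min.
    static int NextBiased(int min, int max, double exponent)
    {
        double u = Rng.NextDouble();            // uniform in [0, 1)
        double skewed = Math.Pow(u, exponent);  // skewed toward 0
        return min + (int)(skewed * (max - min));
    }

    static void Main()
    {
        // Rough sanity check: with exponent 2, roughly a third of samples
        // drawn from [100, 1000) should land in [100, 200).
        const int trials = 100_000;
        int low = 0;
        for (int i = 0; i < trials; i++)
            if (NextBiased(100, 1000, 2.0) < 200) low++;
        Console.WriteLine($"Share in [100, 200): {100.0 * low / trials:F1}%");
    }
}
```

If my maths is right, an exponent of 2 lands about 33% of samples in [100, 200) and roughly 5-6% in [900, 1000), and pushing the exponent toward 3 brings that top band down into the 2-4% range I'm after. Is this a sensible approach, or is there a better-known distribution for this?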