I have a C# program that asks the user for four values:
MinIndex, MaxIndex, MinValue, MaxValue
I want to be able to determine the value for any given index within the [MinIndex, MaxIndex] range. The range of indexes will not always be the same, so I need to work it out first and use that value somehow.
As an example, say MinIndex = 250 and MaxIndex = 750, so the index range is 500; MinValue = 0.025 and MaxValue = 0.254, so the value range is 0.229.
If I do valueRange / indexRange I get 0.000458.
This number lets me take any index, say 267, multiply it by 0.000458, and get the value for that index.
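In code it is roughly this (the variable names are just placeholders I am using, not the exact ones from my program):

    double minIndex = 250, maxIndex = 750;
    double minValue = 0.025, maxValue = 0.254;

    double indexRange = maxIndex - minIndex;   // 500
    double valueRange = maxValue - minValue;   // 0.229
    double step = valueRange / indexRange;     // 0.000458

    double value = 267 * step;                 // works, but only when the index is counted from 0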
However, this only works when the index runs from 0 to 500. How can I use my original indexes, say [250-750], and have a single value I can multiply by to get the value for that index, i.e. [298 * ?]?
The calculation is linear, and because I know the value at the max index and the value at the min index, I know there must be a way to work out the rest.
Sorry if this is a stupid question, but maths is not one of my strong points.
Thank you in advance.