Hi, my problem is that my data set is monotonically increasing, but toward the end of the data some of the values repeat, so that x[i-1] == x[i] as shown below. This causes GSL to raise an error because it requires the x values to be strictly increasing. Is there a solution, fix, or workaround for this problem?

The values are already double precision; this particular data set starts at 9.86553e-06 and ends at 0.999999.

Would the only solution be to offset every value in a for loop?

0.999981
0.999981
0.999981
0.999982
0.999982
0.999983
0.999983
0.999983
0.999984
0.999984
0.999985
0.999985
0.999985
– pyCthon
  • I guess I'm a little confused because double precision gives you far more than 6 significant decimal places. Single precision gives you 6. – Matt Phillips Sep 18 '12 at 15:23
  • It is double precision. The problem is that it's a huge data set and the data get closer and closer to 1 very slowly. – pyCthon Sep 18 '12 at 15:24

2 Answers

I had a similar issue. I removed the duplicates with a simple conditional (an if statement), and this did not affect the final result (checked against MATLAB). Though, this might be a bit problem-specific.
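
For instance, a minimal sketch in C of that approach, assuming the data sit in plain arrays as GSL's spline routines expect (the helper name drop_duplicate_x and the cubic-spline setup are illustrative, not the exact code either of us ran):

    #include <stddef.h>
    #include <gsl/gsl_spline.h>

    /* Keep the first of each run of equal x values so the abscissae
       become strictly increasing; returns the new length. */
    static size_t drop_duplicate_x(double *x, double *y, size_t n)
    {
        size_t m = 0;
        for (size_t i = 0; i < n; ++i) {
            if (m == 0 || x[i] > x[m - 1]) {
                x[m] = x[i];
                y[m] = y[i];
                ++m;
            }
        }
        return m;
    }

    /* Usage: compact the arrays in place, then hand them to GSL. */
    void build_spline(double *x, double *y, size_t n)
    {
        size_t m = drop_duplicate_x(x, y, n);
        gsl_interp_accel *acc = gsl_interp_accel_alloc();
        gsl_spline *spline = gsl_spline_alloc(gsl_interp_cspline, m);
        gsl_spline_init(spline, x, y, m);
        /* ... gsl_spline_eval(spline, x0, acc) as usual ... */
        gsl_spline_free(spline);
        gsl_interp_accel_free(acc);
    }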

– Eugene B

If you've genuinely reached the limits of what double precision allows (your delta is smaller than machine epsilon), then there is nothing you can do with the data as they are: the x data aren't strictly increasing. You'll have to go back to where they are generated and apply some kind of transform to make the differences bigger at the tails. Or you could multiply by a scalar factor and interpolate between the scaled x values on the fly, then divide the factor back out when you are done.

Edit: tr(x) = (x - 0.5)^3 might do reasonably well to space things out, or tr(x) = tan((x - 0.5) * pi). You have to watch out for extreme values in the latter case, though. And of course these transformations might screw up the analysis you're trying to do, so a scalar factor might be the answer; it has to be a transformation under which your analysis is invariant, obviously. Adding a constant is also likely possible.
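
As a rough sketch of the tan version (assuming the data sit in arrays x and y of length n with x in (0, 1), and that the stored values still differ; if two doubles already compare equal, no transform of the stored data will separate them, which is why it belongs where the data are generated):

    #include <math.h>
    #include <stdlib.h>
    #include <gsl/gsl_spline.h>

    /* tr(x) = tan((x - 0.5) * pi): strictly increasing on (0, 1),
       stretching the spacing near both tails. */
    static double tr(double x)
    {
        return tan((x - 0.5) * M_PI);
    }

    double eval_transformed(const double *x, const double *y,
                            size_t n, double x0)
    {
        double *tx = malloc(n * sizeof *tx);
        for (size_t i = 0; i < n; ++i)
            tx[i] = tr(x[i]);          /* widen the tail spacing */

        gsl_interp_accel *acc = gsl_interp_accel_alloc();
        gsl_spline *spline = gsl_spline_alloc(gsl_interp_cspline, n);
        gsl_spline_init(spline, tx, y, n);

        /* Query points go through the same transform, so no inverse
           is needed to read y off at a given x. */
        double y0 = gsl_spline_eval(spline, tr(x0), acc);

        gsl_spline_free(spline);
        gsl_interp_accel_free(acc);
        free(tx);
        return y0;
    }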

– Matt Phillips
  • Perhaps remove the duplicates, then interpolate the removed values between each case as well... – pyCthon Sep 18 '12 at 19:52
  • Screwing up my analysis is what I am worried about. Perhaps I should post this on Computational Science Stack Exchange? – pyCthon Sep 18 '12 at 19:52
  • @pyCthon Never been to that site myself, but if your analysis is sophisticated then maybe you should try it; you haven't said anything about that here, though. – Matt Phillips Sep 18 '12 at 20:06
  • I think you should say, "The x data aren't *strictly* increasing." Strictly increasing means each item is greater than the previous one. Monotonically increasing means each item is greater than *or equal to* the previous one. – Duncan MacIntyre Jun 23 '21 at 17:20