I have the following NumPy array.
a = np.array([0.1, 0.9, 0.17, 0.47, 0.5])
How can I find a number that, when the array is multiplied by it, turns every element into an integer?
For example, in this case that number would be 100, because a
times 100 makes every element an integer:
>>> a * 100
array([10., 90., 17., 47., 50.])
I have tried taking 10 to the power of the maximum number of decimal digits. However, I need the smallest possible such number (and it has to be greater than 0).
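For reference, my power-of-ten attempt looks roughly like this (the string-based digit count is just a sketch of the idea; it is fragile for floats whose printed representation is long):

```python
import numpy as np

a = np.array([0.1, 0.9, 0.17, 0.47, 0.5])

# Count decimal digits from the printed representation of each element
# (works for short decimals like these, but is not float-exact in general).
max_digits = max(len(str(x).split(".")[1]) for x in a)
multiplier = 10 ** max_digits
print(multiplier)  # 100
```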
b = np.array([1.0, 0.5, 0.25, 0.75, 0.5])
Here the answer should be 4 rather than 100:
>>> b * 4
array([4., 2., 1., 3., 2.])