Is it possible to count the distinct digits in a number in constant time O(1)?

Suppose n = 1519; the output should be 3, as there are 3 distinct digits (1, 5, 9).

I have done it in O(N) time, but does anyone know how to find it in O(1) time?
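For illustration, here is a minimal sketch of the kind of O(N) approach I mean (Python, purely illustrative; the function name is mine):

```python
def count_distinct_digits(n):
    """Count distinct decimal digits of n by looking at every digit: O(N) for N digits."""
    return len(set(str(abs(n))))

print(count_distinct_digits(1519))  # 3 (digits 1, 5, 9)
```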
I assume N is the number of digits of n. If the size of n is unlimited, it can't be done in general in O(1) time.

Consider the number n = 11111...111 with 2 trillion digits. If I switch one of the digits from a 1 to a 2, there is no way to discover this without in some way looking at every single digit. Thus processing a number with 2 trillion digits must take (on the order of) 2 trillion operations at least, and in general, a number with N digits must take (on the order of) N operations at least.

However, for almost all numbers, the simple O(N) algorithm finishes very quickly, because you can stop as soon as you have seen 10 distinct digits. Almost all numbers of sufficient length will contain all 10 digits: e.g. the probability of not terminating with the answer '10' after looking at the first 100 digits is about 0.00027, and after the first 1000 digits it's about 1.7e-45. But unfortunately, there are some oddities which make the worst case O(N).
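A sketch of that early-exit scan (illustrative only, not the asker's code; the function name is mine):

```python
def count_distinct_digits_early_exit(n):
    """O(N) in the worst case, but stops as soon as all 10 decimal digits have been seen."""
    seen = set()
    for ch in str(abs(n)):
        seen.add(ch)
        if len(seen) == 10:
            break  # no need to inspect the remaining digits
    return len(seen)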
After seeing that someone really posted a serious answer to this question, I'd rather repeat my own cheat here, which is a special case of the answer described by @SimonNickerson:
O(1) is not possible, unless you work in radix 2, because that way every number other than 0 has both a 1 and a 0, and thus my "solution" works not only for integers...
EDIT
How about 2^k - 1? Isn't that all 1s?
Drat! True... I should have known that when something seems so easy, it is flawed somehow... Since I covered the all-0 case, I should have covered the all-1 case too.
Luckily this case can be tested quite quickly (if addition and bitwise AND are considered O(1) operations): if x is the number to be tested, compute y = (x + 1) AND x. If y = 0, then x = 2^k - 1, because this is the only case in which the addition flips every bit. Of course, this is still somewhat flawed: with bit lengths exceeding the bus width, the bitwise operators are not O(1) anymore, but rather O(N).
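A quick sketch of that test (Python integers are arbitrary precision, so the bus-width caveat is only conceptual here; the function names are mine):

```python
def is_all_ones(x):
    """True if x == 2**k - 1 for some k >= 1, i.e. its binary digits are all 1."""
    return x >= 1 and (x + 1) & x == 0

def distinct_binary_digits(x):
    """The radix-2 'cheat': 0 has only the digit 0, 2**k - 1 has only the digit 1,
    and every other non-negative number has both."""
    if x == 0 or is_all_ones(x):
        return 1
    return 2

print(distinct_binary_digits(0b111))   # 1 (binary 111 uses only the digit 1)
print(distinct_binary_digits(0b1010))  # 2 (binary 1010 uses both digits)
```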
At the same time, I think it can be brought down to O(log N) by breaking the number into bus-width-sized chunks and AND-ing together the neighboring ones, repeating until only one is left: if there were no 0s in the number tested, the last one will be all 1s too...
EDIT2: I was wrong... This is still O(N).
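For completeness, a sketch of that chunk-AND idea (WORD is an assumed "bus width"; Python ints stand in for fixed-width machine words, and as EDIT2 says, the total work is still O(N)):

```python
WORD = 64  # assumed "bus width" in bits, purely illustrative

def has_zero_bit(x):
    """Return True if x (x >= 1) has a 0 anywhere in its binary representation,
    by AND-ing word-sized chunks pairwise until a single word is left.
    The number of rounds is O(log N), but every chunk is touched, so total work is still O(N)."""
    nbits = x.bit_length()
    chunks = [(x >> i) & ((1 << WORD) - 1) for i in range(0, nbits, WORD)]
    pad = nbits % WORD
    if pad:
        # Fill the unused high bits of the top chunk with 1s so the padding
        # is never mistaken for a real 0 digit.
        chunks[-1] |= ((1 << WORD) - 1) ^ ((1 << pad) - 1)
    while len(chunks) > 1:
        chunks = [chunks[i] & chunks[i + 1] if i + 1 < len(chunks) else chunks[i]
                  for i in range(0, len(chunks), 2)]
    return chunks[0] != (1 << WORD) - 1

print(has_zero_bit(2**100 - 1))  # False: all 100 bits are 1
print(has_zero_bit(2**100 - 2))  # True: the lowest bit is 0
```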