Not sure if this is the right place to ask this. In Cormen (CLRS), page 1056, I read that if the running time of an algorithm is Θ(k) and k is represented in unary, i.e. as a string of k 1s, then the running time of the algorithm is O(n), where n is the input size in bits; but if k is represented in binary, then n = ⌊lg k⌋ + 1, and the running time becomes Θ(k) = Θ(2^n).
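To make the contrast concrete, here is a minimal sketch of what I understand the book to be saying (the function `theta_k_work` is just a hypothetical stand-in for "an algorithm that runs in Θ(k) steps", and k = 1,000,000 is an arbitrary example value, neither is from CLRS):

```python
import math

def theta_k_work(k):
    # Stand-in for an algorithm whose running time is Theta(k):
    # it simply performs k constant-time steps.
    total = 0
    for _ in range(k):
        total += 1
    return total

k = 1_000_000
theta_k_work(k)  # ~1,000,000 steps either way; only the measured input size differs

unary_len = len("1" * k)                   # n = k bits          -> work is O(n), polynomial in n
binary_len = math.floor(math.log2(k)) + 1  # n = floor(lg k) + 1 -> work is Theta(2^n), exponential in n

print(unary_len)   # 1000000
print(binary_len)  # 20
```

The amount of work is identical in both cases; what changes is the length n of the input against which that work is measured.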
So my question is: why wouldn't the unary representation be preferred in this case, since it gives polynomial running time, in contrast to exponential time with the binary representation?