I want to write a Markov algorithm that converts unary (tally) numbers such as ||| (3), ||||| (5), and so on into decimal numbers.
I'm studying algorithms for fun and came across this problem. I seem to understand how to do this for numbers smaller than 100, but what about bigger ones (3000, for example)? This probably doesn't have any practical use, but I really want to know how it is done.
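
To be concrete about the semantics I'm assuming (an ordered list of rules; apply the first rule whose left-hand side occurs in the string, at its leftmost occurrence; start over; halt when no rule applies or a terminating rule fires), here is the little Python interpreter I've been testing rule sets against, plus a stripped-down version of the increment rules I've been playing with. The function name run_markov and the (pattern, replacement, terminal) rule format are just my own conventions, not anything standard:

```python
def run_markov(rules, text, max_steps=100_000):
    """Run a Markov (normal) algorithm: rules are (pattern, replacement, terminal) triples."""
    for _ in range(max_steps):
        for pattern, replacement, terminal in rules:
            i = text.find(pattern)
            if i != -1:
                # Apply the first matching rule at its leftmost occurrence.
                text = text[:i] + replacement + text[i + len(pattern):]
                if terminal:
                    return text
                break  # restart scanning from the first rule
        else:
            return text  # no rule applied: the algorithm halts
    raise RuntimeError("step limit exceeded")

# The kind of rules I've been experimenting with: absorb one tally mark
# into the digit to its left ("0|" -> "1", ..., "8|" -> "9"), and turn a
# lone leading "|" into "1". This shows the format I'm using; handling
# carries for arbitrarily large inputs is exactly what I'm asking about.
rules = [(str(d) + "|", str(d + 1), False) for d in range(9)] + [("|", "1", False)]

print(run_markov(rules, "|||"))  # prints "3"
```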