What is the scale-up to your much larger task?
Please note that there can be a significant difference between the actual consumption (of both time and memory) and the consumption estimated from the complexity. The previous answer (Thomas Philipp's) is correct except for one detail:
t = c * 2^n (+ negligible parts)
From a theoretical standpoint, this is a contradiction: if you care about the constant factor c, you may also care about the so-called "negligible parts". Those drop out in a complexity analysis, the O(2^n) world, where the only term that counts is the one that dominates the limit at +infinity.
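To make this concrete, here is a small sketch with hypothetical coefficients (C and D below are invented for illustration): for small n, a "negligible" linear term with a large coefficient can dwarf the dominant exponential term.

```python
# Hypothetical coefficients, chosen so the "negligible" linear
# term actually dominates for small n.
C = 1e-6   # coefficient of the dominant 2^n term
D = 5.0    # coefficient of the lower-order linear term

def exp_term(n):
    """The c * 2^n part of the running time."""
    return C * 2**n

def linear_term(n):
    """The supposedly negligible lower-order part."""
    return D * n

# Only past a crossover point does the exponential take over.
for n in (5, 10, 20, 30):
    dominant = "2^n" if exp_term(n) > linear_term(n) else "linear"
    print(f"n={n:2d}  2^n part={exp_term(n):12.3f}  "
          f"linear part={linear_term(n):6.1f}  dominant: {dominant}")
```

With these made-up coefficients the linear part still dominates at n = 20; the exponential only wins around n = 27 and beyond.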
In practical terms, check your set-up cost and any looming lower-order terms in your algorithm. For instance, one program I worked on had a straightforward O(n^2 log n) solution, preceded by O(n log n) pre-work and some O(n) overhead.
The problem we faced was that, to our consumers, the algorithm didn't appear to scale that way. For a small task, the overhead dominated. For a typical evaluation task, the pre-work and main body took roughly equal time. For a true application, the main body showed its true colours and took over, although the first two stages then took longer than an eval task's entire run.
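The behaviour above can be sketched with a toy cost model. The stage constants and problem sizes below are entirely hypothetical, chosen only to reproduce the pattern described: big constants on the cheap stages, a tiny constant on the expensive main body.

```python
import math

def stage_costs(n):
    """Hypothetical per-stage cost model (constants are made up)."""
    overhead = 1000.0 * n                   # O(n) overhead, large constant
    prework = 50.0 * n * math.log2(n)       # O(n log n) pre-work
    body = 0.001 * n**2 * math.log2(n)      # O(n^2 log n) main body, tiny constant
    return overhead, prework, body

sizes = (("small task", 100), ("eval task", 50_000), ("real run", 5_000_000))
for label, n in sizes:
    o, p, b = stage_costs(n)
    print(f"{label:10s} n={n:>9,}  overhead={o:.2e}  "
          f"pre-work={p:.2e}  body={b:.2e}")
```

With these constants the small task is all overhead, the mid-size eval task spends about as long in pre-work as in the main body, and only the large run is dominated by the O(n^2 log n) stage.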
In short, the medium-sized computations did not scale as an external viewer would expect, because of the large constants and coefficients in the lower-complexity stages.