Before the days of virtual memory, in V7 Unix for example, I don't believe there was any fixed limit on the size of the stack. The stack grew downwards from the top of memory, and the heap grew upwards from the bottom (well, from the top of the text+data segments, anyway), and as long as the two didn't meet ("don't let the beams cross!" :-) ), you were fine.
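You can still see the general shape of that layout on most Unix-like systems today. Here's a minimal sketch in modern C (not anything V7 itself could have compiled; it uses prototypes and `%p`) that prints the address of a stack variable and the current program break, i.e. the top of the heap:

```c
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    int on_stack;                    /* lives up near the top of memory */
    void *heap_end = sbrk(0);        /* current "break": the top of the heap */

    printf("stack variable at %p\n", (void *)&on_stack);
    printf("heap ends at      %p\n", heap_end);
    /* In the layout described above, the first address is well above the
       second, and the gap between them is all the room both have to grow into. */
    return 0;
}
```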
What there was a limit on was how much the stack could grow at once. If you had a function that allocated, say, a 1K local variable, you could call it recursively several dozen times before things went south. But if you had a function that allocated a 10K local variable, it tended to crash on the first call.
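To make the two cases concrete, here's a hedged illustration in modern C of the kinds of functions I mean (the names are mine, and on a current system with big default stacks neither of these will actually crash; it just shows the shape of the code being described):

```c
#include <stdio.h>
#include <string.h>

/* Recursing with a ~1K frame nudges the stack down a little at a time,
   so a kernel that grows the stack on "near miss" faults can keep up. */
static void small_frames(int depth)
{
    char buf[1024];
    memset(buf, 0, sizeof buf);      /* touch the whole frame */
    if (depth > 0)
        small_frames(depth - 1);
}

/* A single ~10K frame drops the stack pointer far below anything allocated
   so far; on the old systems described here, the first write into it could
   land outside what the OS recognized as stack. */
static void big_frame(void)
{
    char buf[10 * 1024];
    memset(buf, 0, sizeof buf);
}

int main(void)
{
    small_frames(50);
    big_frame();                     /* harmless on a modern machine */
    printf("survived\n");
    return 0;
}
```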
What was happening was that the OS noticed when a stack access exceeded the bounds of what had already been allocated for the stack. If you went past the end of the stack by a relatively small amount, the OS assumed the stack was growing and allocated a little more for you. But if you tried to grow the stack by a lot all at once, the access landed somewhere in the gap between the top of the heap and the end of the stack; to the OS that looked like a stray pointer reference, and you got a segmentation violation instead.
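In pseudo-C, the decision the fault handler was making looks roughly like this. This is a sketch of the logic as I've described it, not actual V7 kernel code; the names and the `GROW_SLOP` threshold are made up:

```c
/* Hypothetical sketch of the fault-handler decision described above. */
#define GROW_SLOP  (2 * 1024)        /* how far past the stack we'll forgive */

enum fault_action { GROW_STACK, DELIVER_SIGSEGV };

enum fault_action classify_fault(unsigned long fault_addr,
                                 unsigned long stack_low,   /* lowest allocated stack address */
                                 unsigned long heap_high)   /* current top of the heap */
{
    if (fault_addr >= stack_low)
        return GROW_STACK;           /* not actually below the stack at all */
    if (stack_low - fault_addr <= GROW_SLOP && fault_addr > heap_high)
        return GROW_STACK;           /* a little past the end: assume the stack is growing */
    return DELIVER_SIGSEGV;          /* far below: looks like a stray pointer */
}
```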
(This is from memory, so I could be wrong on the details. If I ever boot up the old PDP-11 again, I'll have to remember to test this out.)