For Turing machines, complexity can be measured both in space (memory used on the tapes) and in time.
There are classes such as PSPACE and EXPSPACE.
Further, we can exhibit algorithms that provably run in polynomial space, for example:
http://www.springerlink.com/content/3hqtq11mqjbqfj2g/
However, when I actually write programs, some run faster than others, and some have a smaller memory (RAM) footprint than others.
Presumably, if I implement a PSPACE algorithm for problem X and also an EXPSPACE algorithm for the same problem, the EXPSPACE program should use much more RAM than the PSPACE one.
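For concreteness, here is a toy sketch of the kind of gap I mean. It has nothing to do with the linked paper; the problem (counting bit strings with no two adjacent ones) and the `peak_kib` measuring helper are just things I made up so I can watch the peak memory of two programs that compute the same answer, one keeping an exponentially large intermediate structure and one keeping only a couple of counters:

```python
import tracemalloc
from itertools import product

def count_exponential_space(n):
    # Brute force: materialize all 2^n bit strings of length n, then count the valid ones.
    strings = ["".join(bits) for bits in product("01", repeat=n)]
    return sum(1 for s in strings if "11" not in s)

def count_constant_space(n):
    # Dynamic programming with two counters: same answer, O(1) working space.
    end0, end1 = 1, 1  # valid strings of length 1 ending in 0 / ending in 1
    for _ in range(n - 1):
        end0, end1 = end0 + end1, end0
    return end0 + end1

def peak_kib(fn, *args):
    # Measure the peak Python heap usage of a single call, in KiB.
    tracemalloc.start()
    result = fn(*args)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, peak / 1024

n = 18
print(peak_kib(count_exponential_space, n))  # peak grows roughly like 2^n
print(peak_kib(count_constant_space, n))     # peak stays essentially flat
```

On a toy like this the gap is obvious once I run it, but for real algorithms the constants, data-structure overheads, and input encodings are not so transparent.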
Is there any way to estimate, before running the program, how much RAM will be involved, based on the theoretical space complexity of the algorithm I start from?