I'd like to be able to test some guesses about the memory complexity of various command line utilities.

Taking as a simple example

    grep pattern file

I'd like to see how memory usage varies with the size of `pattern` and the size of `file`.
For time complexity, I'd make a guess, then run

    time grep pattern file

on various sized inputs to see if my guess seems to be borne out in reality, but I don't know how to do this for memory.
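For concreteness, this is the sort of harness I have in mind on the time side; the filler text, the sizes, and the literal `pattern` are arbitrary placeholders:

    #!/usr/bin/env bash
    # Time grep over inputs of increasing size; sizes and filler are arbitrary.
    for lines in 1000 10000 100000 1000000; do
        yes 'some filler text' | head -n "$lines" > "input_${lines}.txt"
        printf '%8d lines: ' "$lines"
        { time grep pattern "input_${lines}.txt" > /dev/null; } 2>&1 | grep real
    done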
One possibility would be a wrapper script that initiates the job and samples memory usage periodically, but this seems inelegant and unlikely to give the real high watermark.
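To make the idea (and its weakness) concrete, here's a minimal sketch of such a wrapper (the script name is mine), polling `ps` for the resident set size; the 0.1 s interval is arbitrary, and any allocation that comes and goes between samples is invisible to it:

    #!/usr/bin/env bash
    # Hypothetical sampler: run a command, poll its RSS until it exits.
    # Usage: ./peakrss.sh grep pattern file
    "$@" &
    pid=$!
    peak=0
    while kill -0 "$pid" 2>/dev/null; do
        rss=$(ps -o rss= -p "$pid" | tr -d ' ')   # RSS in KB
        [ -n "$rss" ] && [ "$rss" -gt "$peak" ] && peak=$rss
        sleep 0.1                                 # sampling, so the peak can be missed
    done
    wait "$pid"
    echo "peak RSS sampled: ${peak} KB"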
I've seen `time -v` suggested, but don't have that flag available on my machine (running bash on OSX) and don't know where to find a version that supports it.
I've also seen that on Linux this information is available through the `proc` filesystem, but again, it's not available to me in my context.
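For reference, this is what I mean by the Linux route; as I understand it the kernel records a process's peak resident set size as `VmHWM` in its status file (Linux only, so no use to me here):

    # Linux only: peak resident set size of a running process, e.g. this shell.
    grep VmHWM "/proc/$$/status"
    # prints a line of the form:  VmHWM:  <peak> kB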
I'm wondering if `dtrace` might be an appropriate tool, but again I'm concerned that a simple sample-based figure might not be the true high watermark.
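If `dtrace` is viable, I imagine avoiding sampling altogether by tracing `malloc`/`free` pairs, something along these lines. This is an untested sketch: it ignores `calloc`/`realloc` and direct VM allocations, so it would undercount the real footprint:

    # Untested sketch: trace malloc/free with dtrace rather than sampling.
    sudo dtrace -qn '
      pid$target::malloc:entry  { self->sz = arg0; }
      pid$target::malloc:return /self->sz/ {
          sizes[arg1] = self->sz;          /* size, keyed by returned pointer */
          cur += self->sz;
          hwm = cur > hwm ? cur : hwm;     /* running high watermark */
          self->sz = 0;
      }
      pid$target::free:entry /sizes[arg0]/ { cur -= sizes[arg0]; sizes[arg0] = 0; }
      END { printf("malloc high watermark: %d bytes\n", hwm); }
    ' -c "grep pattern file"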
Does anyone know of a tool or approach that would be appropriate on OSX?
**Edit**: I removed two mentions of disk usage, which were just asides and perhaps distracted from the main thrust of the question.