I've been using a sitemapping tool to get a simple count of links below a specific URL. The free trial has ended, and rather than pay $70 for what is very simple functionality, I figure I should just use wget.
Here's what I have so far:

    wget --spider --recursive http://url.com/
I'm not sure, however, how to get a count of the links found from this output. I'm also slightly nervous about whether it's actually doing what I want: will it only follow links below the domain of url.com?
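The closest I've come to getting a count is parsing wget's log, something along these lines (a rough sketch; I'm assuming every URL wget requests shows up on a line starting with "--", followed by a timestamp and then the URL itself):

    # Run the spider and parse wget's log (written to stderr, hence 2>&1).
    # Each request appears on a line like:
    #   --2023-05-01 10:00:00--  http://url.com/some/page
    # so the third whitespace-separated field is the URL.
    wget --spider --recursive http://url.com/ 2>&1 \
      | grep '^--' \
      | awk '{ print $3 }' \
      | sort -u \
      | wc -l

I'm not confident this catches everything, though, and it doesn't answer my question about staying within the domain.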
Any ideas on how to accomplish this?
Thanks.