
I'd like to scan all the URLs on my website and get the files in them, but there are too many for me to do this manually. How would I do this?

I don't mind how it's formatted, as long as there is some kind of order to it.

E.g.:

URL/FOLDER
URL/FOLDER/FILE
URL/FOLDER/FILE2
URL/FOLDER2/FILE

All in a single file, such as a .txt.

How would I do that?

1 Answer


Try ls with the -R switch, which lists subdirectories recursively. Here is an example:

ls -R /path/to/whatever > folders.txt 
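One caveat: ls -R prints each directory as a header followed by its contents, rather than one full path per line. If you want output in the URL/FOLDER/FILE shape you described, find may match more closely. A minimal sketch, using a throwaway sample tree under /tmp as a stand-in for your site's document root (which you would substitute):

```shell
# Build a small sample tree mimicking the layout in the question.
mkdir -p /tmp/demo_site/FOLDER /tmp/demo_site/FOLDER2
touch /tmp/demo_site/FOLDER/FILE /tmp/demo_site/FOLDER/FILE2 /tmp/demo_site/FOLDER2/FILE

# find prints every directory and file as a full path, one per line,
# redirected into a text file just like the ls example above.
find /tmp/demo_site > folders.txt
cat folders.txt
```

Note this assumes you have shell access to the server's filesystem; if you only have HTTP access to the site, you would need a crawler instead.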
dastergon
  • Sorry, it's kinda late, but that was perfect. I was busy fixing everything. –  Jan 23 '13 at 00:07