How do I scan my folders for a website? Like a crawler?

linux-howto

http://serverfault.com – I'd like to scan all the url's on my website as well as get the files in them, but the thing is, there are too many for me to do this manually so how would I do this? I'd like it formatted anyway as long as there is some type of order to it. Eg: URL/FOLDER URL/FOLDER/FILE URL/FOLDER/FILE2 URL/FOLDER2/FILE All in a file like a .txt How would I do that? (HowTos)