[ubuntu] Download Website Links (but not content)


http://ubuntuforums.org – I am looking for a tool that can download the links referred to by a website, but not the website content itself. I simply need the list of URLs dumped to a text file, and the crawl needs to be recursive within the domain I specify. Can anyone suggest a tool for doing this? I've looked at wget, but it doesn't seem to support this...
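If no ready-made tool fits, the task described above (recursively walk pages within one domain, record every URL encountered, never save page content to disk) is small enough to sketch in a short script. The following is a minimal, hypothetical example using only the Python standard library; the function names (`extract_links`, `crawl`) and the plain breadth-first strategy are illustrative choices, not a reference to any particular tool:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collect absolute href targets from <a> tags in one page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.add(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return the set of absolute URLs linked from an HTML string."""
    parser = LinkParser(base_url)
    parser.feed(html)
    return parser.links


def crawl(start_url, out_path):
    """Breadth-first crawl restricted to start_url's domain.

    Records every URL seen (pages are fetched only to read their
    links, never written to disk) and dumps the list to a text file.
    """
    domain = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable or non-text pages
        for link in extract_links(html, url):
            # Recurse only within the specified domain.
            if link not in seen and urlparse(link).netloc == domain:
                queue.append(link)
    with open(out_path, "w") as f:
        f.write("\n".join(sorted(seen)))
```

As an aside, wget may come closer than it first appears: its `--spider` mode combined with `--recursive` traverses links without keeping the downloaded files, and the URLs can be grepped out of its log output, though that is admittedly a workaround rather than a direct URL-list feature.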