
How can I recursively grab a site using wget?

linux-howto

http://stackoverflow.com – Just picking a random site as an example, I have tried:

wget --mirror --convert-links -p http://www.coursera.org

but it does not properly download the site as a working local copy.
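A fuller invocation usually works better than the one above; the sketch below adds the options wget documents for offline mirroring (the URL http://www.example.com/ is a stand-in for whatever site you actually want):

```shell
# A minimal sketch of a recursive site mirror with GNU wget.
# http://www.example.com/ is a placeholder target, not the real site.
#
# --mirror           recursion + timestamping, infinite depth (-r -N -l inf --no-remove-listing)
# --convert-links    rewrite links in the saved pages so the local copy browses offline
# --page-requisites  also fetch the CSS, images, and scripts each page needs to render
# --adjust-extension save pages with .html extensions so they open in a browser
# --no-parent        never ascend above the starting directory when recursing
wget --mirror --convert-links --page-requisites --adjust-extension --no-parent http://www.example.com/
```

Note that --page-requisites is the long form of the -p flag used in the question; sites that build pages with JavaScript or block crawlers via robots.txt may still not mirror cleanly, since wget only fetches what appears in the static HTML.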