wget and connection errors / timeouts

http://serverfault.com – I was using wget last week to recursively download a whole website of HTML pages. I used it this way:

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --domains XXXX.com --no-parent http://www.XXXX.com

The issue is that since the download took a couple of days, there were occasional connection timeouts, network disconnections, and so on, and when that happened wget seems to have skipped the HTML pages it couldn't fetch, which is not good in this case. I wonder if there is a flag (I've been looking in the manpage to no avail...) to tell wget to keep retrying failed fetches.
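
For reference, wget does have standard retry options that address this. Below is a minimal sketch of the same command with them added (check your wget version's manpage, as defaults and exact behavior can differ across versions):

# Same recursive mirror, with retry behavior made explicit:
# --tries=0 retries each URL indefinitely (the default gives up after 20 tries),
# --waitretry=10 applies a linear back-off of up to 10 seconds between retries,
# --retry-connrefused treats "connection refused" as a transient, retryable error,
# --timeout=30 caps DNS/connect/read waits so a stalled connection fails and is retried.
wget --recursive --no-clobber --page-requisites --html-extension \
     --convert-links --domains XXXX.com --no-parent \
     --tries=0 --waitretry=10 --retry-connrefused --timeout=30 \
     http://www.XXXX.com

Note that even with --tries=0, the manpage says wget treats certain failures (such as a 404) as fatal and does not retry them.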