wget -r -l 2 -p -k -erobots=off http://www.website.de/page
-r : recursive
-l 2 : max depth 2
-p : download page requisites (images, stylesheets, etc.)
-k : convert links for local reading
-e robots=off : ignore robots.txt
-erobots=off is not accepted on my system (the spaced form, -e robots=off, may help); without it, this works:
wget -r -l 2 -p -k http://www.website.de/page
wget -r -l 3 -E -H -k -K -e robots=off -p https://home.zhaw.ch/~rumc/
-E : add an .html extension to downloaded HTML files
-H : span hosts (follow links to other domains)
-K : keep a .orig backup of each file before link conversion
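The short flags above can also be written with their long names, which makes scripts easier to read. A sketch of the equivalent command, using the example URL from the first note (long-option names per the GNU Wget manual; --execute is the long form of -e):

```shell
# Mirror a page two levels deep with its requisites, rewriting links
# for offline browsing and ignoring robots.txt.
wget --recursive --level=2 --page-requisites --convert-links \
     --execute robots=off http://www.website.de/page
```

If the unspaced -erobots=off form is rejected, the spaced or long form shown here is the safer spelling.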