I'm trying to recursively download my WordPress website into a static copy using wget. The problem is that every run uses far too much bandwidth (about 3.5 GB) even though the resulting download is only about 20 MB, which is odd. So I'd like to crawl it over localhost instead, but when I point wget at localhost I only get the index page. Since WordPress stores the site URL in its database, how am I supposed to download via localhost? I have already set this up in the Apache configuration; I just want to mirror the site without using so much bandwidth.
I tried the -N (timestamping) option to reduce bandwidth, but I keep getting errors saying the files have no Last-Modified header, so it isn't helping.
This is the command I'm using:
wget -N --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains website website -P /opt/
Thank you,
UPDATE 1: I added an entry in /etc/hosts mapping the website's hostname to 127.0.0.1, but wget still gets redirected back to the original IP, and even then it only downloads the index page.
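For reference, this is roughly what my localhost attempt looks like (example.com is a placeholder for the real domain; the --header trick is something I'm experimenting with so Apache serves the right virtual host over the loopback address):

```shell
# /etc/hosts — point the site's domain at the local machine
# (example.com is a placeholder for the actual WordPress domain)
127.0.0.1   example.com

# Crawl via the loopback address, but send the real Host header so
# Apache picks the WordPress virtual host instead of its default site.
wget --recursive --page-requisites --html-extension --convert-links \
     --restrict-file-names=windows \
     --header "Host: example.com" \
     -P /opt/ \
     http://127.0.0.1/
```

Even with this, the canonical redirect WordPress issues still sends wget back to the original address.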
Is there a way to tell the server to force-add a Last-Modified header to all WordPress files?
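To show what I mean about the missing header, this is how I'm checking individual responses (assuming the site answers on localhost; the theme path is just an example):

```shell
# -s: silent, -I: fetch headers only. WordPress-generated HTML pages
# typically come back without a Last-Modified header, while static
# files served directly by Apache (CSS, images) usually include one.
curl -sI http://localhost/ | grep -i 'last-modified'
curl -sI http://localhost/wp-content/themes/example-theme/style.css | grep -i 'last-modified'
```

The first command prints nothing for me, which matches the -N errors from wget.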