I need to install software on a server that is not connected to the internet. I'll have access to part of the internet for one day to do the installation.
They are asking me for the URLs/internet addresses that I'll need.
For the time being, my plan was:
- Get all the wget and explicit links.
- Get the links for the OS packages with the following command (the example is with curl):

    apt-get download --print-uris curl
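For the dependency URLs, I was thinking of something like the following sketch (assuming a Debian/Ubuntu system; apt-get download --print-uris only prints the URI of the named package, while the install form also resolves missing dependencies — the cut step is just my guess at pulling out the quoted URL field):

```shell
# Print the URIs apt would fetch for curl and any of its missing
# dependencies, without actually downloading anything.
# Each output line looks like: 'http://...deb' filename size checksum
apt-get install --print-uris -qq curl > curl-uris.txt

# Keep only the URL itself (the field between the single quotes),
# which is what the firewall request needs.
cut -d"'" -f2 curl-uris.txt > curl-urls.txt
```

If the package is already installed, the output is simply empty, so this would need to be run on a clean machine (or container) that mirrors the target server.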
Nevertheless, I'm afraid there may be other links that I'm not aware of (for example, ones requested by other commands).
Is there a way to do a clean installation and log all the URLs the server requests throughout the installation?
I came across this Stack Overflow question, which is similar but not exactly the same, and I can't make it work for my case. What I tried was to run the following command (in the background):

    tcpdump -i any -A -vv -s 0 | grep -e "GET" -e "Host:" -e "POST" > log &
But when running wget stackoverflow.com, nothing gets logged. I have never used tcpdump before, so maybe I'm doing something wrong.
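For reference, here is the same pipeline with my two guesses at the problem folded in as comments: grep buffers its output in blocks when writing to a file (so matches may not appear for a long time unless it is told to flush per line), and the filter can only ever match plaintext HTTP, since HTTPS payloads are encrypted:

```shell
# Same capture as above, but with both tcpdump (-l) and grep
# (--line-buffered) flushing line by line; without this, grep holds
# matches in a block buffer when its output is a file, so the log
# can stay empty even though traffic is flowing.
# Caveat: this only sees plaintext HTTP -- for HTTPS the "GET" and
# "Host:" strings are inside the encrypted payload and never match.
sudo tcpdump -i any -A -s 0 -l \
  | grep --line-buffered -e "GET" -e "Host:" -e "POST" > log &
```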
Thank you very much for your time and help.