I have a text file, list.txt, that has a bunch of URLs, one per line. I want to set up a cron job that wgets/curls each URL in the file once a day but doesn't save anything to disk.
I tried running this in the terminal first:
wget -i /root/list.txt -O /dev/null
Understandably, the command doesn't work. It saves list.txt itself to /dev/null instead of fetching the URLs inside it, and then reports "No URLs found".
So how do I do this properly? How do I wget each URL from a list without saving anything to the computer?
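In case it helps, this is roughly the loop I have in mind for the cron script. Everything here is a throwaway stand-in so the snippet runs on its own (a temp list with a file:// URL instead of my real /root/list.txt with http URLs), and I'm not sure curl's flags are the right ones for this:

```shell
#!/bin/sh
# Stand-in demo: a temp file as the "page" and a temp list pointing at it,
# so this runs without network. The real job would read /root/list.txt.
target=$(mktemp); printf 'hello\n' > "$target"
list=$(mktemp);   printf 'file://%s\n' "$target" > "$list"

fetched=0
while IFS= read -r url; do
    [ -n "$url" ] || continue            # skip blank lines
    # -s silences progress output, -o /dev/null discards the body
    if curl -s -o /dev/null "$url"; then
        fetched=$((fetched + 1))
    fi
done < "$list"

echo "fetched $fetched url(s)"
rm -f "$list" "$target"
```

Is a per-URL loop like this the right approach, or is there a single wget/curl invocation that does it?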