
I've run into several dead ends trying to come up with a result from a Google search. Essentially, I have a list of about 20 websites, all research institutes that occasionally update their websites/blogs with their latest findings.

I'm trying to either A - find software that can check for new articles, then send me the title and link to each article, or B - write a script that checks for new articles, then sends me the title and link.

Any suggestions or software recommendations?

Brian

2 Answers


You should first see if any of the sites have an RSS feed. That is fairly common, and will do the work for you.
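If a site does publish a feed, pulling the titles and links out of it is a few lines of standard-library Python. A minimal sketch, assuming RSS 2.0 (the feed URL would be whatever each site advertises; `fetch_feed_items` is a hypothetical helper name):

```python
import urllib.request
import xml.etree.ElementTree as ET

def parse_feed_items(xml_text):
    """Return (title, link) pairs for each <item> in RSS 2.0 feed XML."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title", default="(no title)"),
             item.findtext("link", default=""))
            for item in root.iter("item")]

def fetch_feed_items(feed_url):
    """Download a feed and parse its items."""
    with urllib.request.urlopen(feed_url) as resp:
        return parse_feed_items(resp.read())
```

Atom feeds use different element names (`entry` instead of `item`, and the link lives in an attribute), so a real script would need a second branch or a feed-parsing library.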

I've built similar things. If the articles have a published date, you could keep a file or database with the most recent date you've seen for each site, then treat anything newer as a new article on the next check.
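The state-keeping part of option B can be very small. A sketch, assuming you track article links you've already seen in a JSON file (the file name and the `(title, link)` item format are my assumptions, not anything from the question):

```python
import json
import os

STATE_FILE = "seen_links.json"  # assumed name for the state file

def load_seen(path=STATE_FILE):
    """Load the set of links reported on previous runs."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()

def find_new(items, seen):
    """items is a list of (title, link) pairs; return those not yet seen."""
    return [(title, link) for title, link in items if link not in seen]

def save_seen(seen, path=STATE_FILE):
    """Persist the updated set of seen links for the next run."""
    with open(path, "w") as f:
        json.dump(sorted(seen), f)
```

On each run you would fetch the current items per site, call `find_new`, email yourself the result, then add the new links to `seen` and save. A cron job (or Task Scheduler on Windows) can run it daily.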

Jason Bellows

The easiest way to save a web page is to download it to your computer. In Chrome, open the three-dot menu and select More Tools > Save page as. For Firefox, open the hamburger menu and choose Save Page As. On Safari, go to File > Save as or File > Export as PDF, and in Microsoft Edge, open the three-dot menu and choose More tools > Save page as.
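Those menu steps do the save by hand; the same thing can be scripted if you want a local copy of each page without opening a browser. A sketch using only the standard library (URL and output file name are placeholders, and this grabs only the raw HTML, not images or stylesheets):

```python
import urllib.request

def save_page(url, out_path):
    """Download the raw HTML of a page to a local file; return byte count."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read()
    with open(out_path, "wb") as f:
        f.write(html)
    return len(html)
```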