
I have a very specific environment in which I want to scrape the content of a web page. Because of a sophisticated authentication scheme, I can only do it in Firefox (it only works with a plugin that is only available for Firefox). I write my scraper in the console (or, more recently, in Scratchpad) and want to log all the data to the console so I can copy and save it later.

I want to iterate over the array in a synchronous way, so I use setInterval. But this breaks when I trigger the click() event, because that reloads the page. How can I fix this?

let xxdata = ['dfdf', 'jul', 'joh'];
let i = 0;
setInterval(function() {
    // scrape data from the page
    const text = document.getElementById('AA').getElementsByClassName('BB')[0].childNodes[0].nodeValue;
    // fill the form with the next name and submit
    const name = xxdata[i];
    document.getElementsByName('XX')[0].value = name;
    document.getElementsByName('YY')[0].click(); // reloads the page here and breaks the loop
    i++;
}, 5000);

Any help (maybe a completely different approach) is highly appreciated.


1 Answer


I ended up using Greasemonkey and saving the data to localStorage.
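
Because a Greasemonkey userscript runs again on every page load, the loop state (the index and the collected results) can live in localStorage instead of in a setInterval closure. Below is a minimal sketch of that idea, reusing the element selectors and xxdata from the question; the @match URL and the storage key names (scrapeIndex, scrapeResults) are only placeholders, not part of the original setup.

// ==UserScript==
// @name     scrape-loop
// @match    https://example.com/*
// @grant    none
// ==/UserScript==

(function () {
    const xxdata = ['dfdf', 'jul', 'joh'];

    // Restore the loop state that survived the reload.
    const i = parseInt(localStorage.getItem('scrapeIndex') || '0', 10);
    const results = JSON.parse(localStorage.getItem('scrapeResults') || '[]');

    // Scrape the value rendered by the previous click (skip on the very first run).
    if (i > 0) {
        const text = document.getElementById('AA')
            .getElementsByClassName('BB')[0].childNodes[0].nodeValue;
        results.push(text);
        localStorage.setItem('scrapeResults', JSON.stringify(results));
    }

    // Done: dump everything to the console for copy/paste.
    if (i >= xxdata.length) {
        console.log(JSON.stringify(results));
        return;
    }

    // Fill the form for the next entry, remember where we are, and trigger the reload.
    document.getElementsByName('XX')[0].value = xxdata[i];
    localStorage.setItem('scrapeIndex', String(i + 1));
    document.getElementsByName('YY')[0].click();
})();

Clearing the two keys with localStorage.removeItem() before a fresh run avoids picking up stale state from an earlier attempt.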
