I have a simple scraping application where I want to find all hrefs for matches:
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

import com.gargoylesoftware.htmlunit.BrowserVersion;

public class MainClass {
    public static void main(String[] args) {
        WebDriver driver = new HtmlUnitDriver(BrowserVersion.CHROME);
        // driver = new ChromeDriver();
        driver.manage().timeouts().pageLoadTimeout(5, TimeUnit.MINUTES);
        driver.get("https://www.tipsport.cz/live");
        System.out.println("HTMLELEMENT: \n" + driver.getPageSource());
        List<WebElement> elements = driver.findElements(By.xpath("//div[@id='events']//a"));
        System.out.println("Size of elements: " + elements.size());
    }
}
When I run it this way with HtmlUnitDriver, I get "Size of elements: 0". But when I look at the printed page source in the console and search for the element with id="events", it is there:
<div class="liveList" id="events">
When I run the same code with ChromeDriver, it finds the elements (51 on my last run).
Why doesn't this line

List<WebElement> elements = driver.findElements(By.xpath("//div[@id='events']//a"));

behave the same with ChromeDriver and HtmlUnitDriver? Could you please tell me what I'm doing wrong? I need to run this without Chrome.
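In case it matters, my guess is that the links are added by JavaScript after page load, so HtmlUnit may be returning the DOM before the scripts have run. Would something like this help? A sketch of what I mean, assuming the two-argument HtmlUnitDriver constructor (which takes a boolean to enable JavaScript) and an explicit wait for the links to appear:

```java
import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

import com.gargoylesoftware.htmlunit.BrowserVersion;

public class MainClass {
    public static void main(String[] args) {
        // Second argument = true enables JavaScript in HtmlUnit,
        // which is off by default in some versions.
        WebDriver driver = new HtmlUnitDriver(BrowserVersion.CHROME, true);
        driver.get("https://www.tipsport.cz/live");

        // Wait for the JS-rendered links instead of reading the DOM immediately.
        WebDriverWait wait = new WebDriverWait(driver, 30);
        List<WebElement> elements = wait.until(
                ExpectedConditions.presenceOfAllElementsLocatedBy(
                        By.xpath("//div[@id='events']//a")));

        System.out.println("Size of elements: " + elements.size());
        driver.quit();
    }
}
```

I'm not sure whether HtmlUnit's JavaScript engine can execute this particular site's scripts, though, so this may still not match what ChromeDriver sees.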