I am web scraping a web table that looks like the following:
|   | A    | B       | C     | D                           |
|---|------|---------|-------|-----------------------------|
| 1 | Name | Surname | Route | href="link with more info"  |
| 2 | Name | Surname | Route | href="link with more info"  |
| 3 | Name | Surname | Route | href="link with more info"  |
links = driver.find_elements(by='xpath', value='//a[@title="route detail"]')
So far so good, I get what I want.
Now I want to click each collected link to gather the info on the subpage (which I know how to do), return to the main page, move on to the second row, and so forth.
for link in links:
    # links = driver.find_elements(by='xpath', value='//a[@title="route detail"]')
    link.click()
    time.sleep(2)
    driver.back()
The code above works for the first run and then throws the error:
Message: stale element reference: element is not attached to the page document
I tried adding various waits, refreshes, etc., but with no success. I am using Selenium 4.6.0. Incidentally, if I execute the statements line by line outside the for loop in a Jupyter Notebook, it works.
I also moved the find_element call inside the for loop, but it still doesn't work.
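For completeness, the direction I am experimenting with is to stop holding WebElement references across page loads entirely: collect the href strings up front (strings cannot go stale) and navigate to them directly. This is only a sketch, assuming `driver` is an already-initialized Selenium 4 WebDriver; the function name `scrape_route_details` is made up for illustration:

```python
def scrape_route_details(driver):
    """Visit each route-detail link without triggering a stale element.

    WebElement references die when the DOM is rebuilt after click()/back(),
    so instead we extract the href strings first and call driver.get() on
    each one; plain strings survive any number of navigations.
    """
    hrefs = [a.get_attribute("href")
             for a in driver.find_elements(by="xpath",
                                           value='//a[@title="route detail"]')]
    for href in hrefs:
        driver.get(href)  # navigate directly; no click()/back() round trip
        # ... scrape the subpage here ...
    return hrefs
```

The alternative would be to re-run `find_elements` after every `driver.back()` and index into the fresh list, but collecting the hrefs once avoids the re-lookup altogether.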