When you navigate to another page, all the web elements Selenium collected earlier (they are really references to live elements in the DOM) become stale, because the page is rebuilt when you open it again.
To make your code work, you need to collect the list of links again on each iteration.
This should work:
import time

from selenium.webdriver.common.by import By

links = driver.find_elements(By.CSS_SELECTOR, 'a')
for i in range(len(links)):
    links[i].click()   # visit the page behind the i-th link
    # scrape the page here
    driver.back()      # go back to the main page; the next iteration clicks the next link
    time.sleep(1)      # small delay to let the main page finish loading
    links = driver.find_elements(By.CSS_SELECTOR, 'a')  # re-collect the links on the rebuilt main page
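If a fixed one-second sleep turns out to be too short (or wastefully long), an explicit wait is a more reliable way to re-collect the links once the main page has been rebuilt. This is only a sketch of that one step, using Selenium's WebDriverWait with the same 'a' selector:

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# wait up to 10 seconds for the anchors to be present again, then re-collect them
links = WebDriverWait(driver, 10).until(
    EC.presence_of_all_elements_located((By.CSS_SELECTOR, 'a'))
)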
Also make sure that all the <a> elements on that page are actually relevant links, since that may not be the case.
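If only some of the links should be visited, a more specific selector keeps unrelated anchors (navigation, footer, etc.) out of the list. A minimal sketch, where 'div.results a' is a placeholder selector you would replace with one that matches your page:

from selenium.webdriver.common.by import By

# 'div.results a' is a hypothetical selector; adjust it to match only the links you want to visit
links = driver.find_elements(By.CSS_SELECTOR, 'div.results a')
# optionally drop anchors that have no href at all
links = [a for a in links if a.get_attribute('href')]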