In my Python scraping script, while collapsing multiple buttons on a page, a couple of popups randomly appear on the page and the script fails.
These two popups are already handled at the beginning of the script, but the website decides, in a non-systematic way, to show them again.
This is the relevant part of the script, where it sleeps for 3 seconds between one click and the next:
collapse_buttons = driver.find_elements_by_css_selector('.suf-CompetitionMarketGroup suf-CompetitionMarketGroup-collapsed ')
for button in collapse_buttons:
    button.click()
    sleep(3)
These are the two lines of the script where I click on the popups at the beginning:
wait.until(EC.presence_of_element_located((By.XPATH, '/html/body/div[3]/div/div[2]/div[2]'))).click()
driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[1]/div/div[3]/div[4]/div[3]/div').click()
Do you think there's a way to keep the process running, ready to click on these two popups, without it going into an error?
CodePudding user response:
You can try to close the popups every time a click in your for loop fails:
from time import sleep

from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC


def try_closing_popups():
    # Dismiss both popups if they are present; otherwise ignore the failure.
    # "wait" is the WebDriverWait instance you already created.
    try:
        wait.until(EC.presence_of_element_located((By.XPATH, '/html/body/div[3]/div/div[2]/div[2]'))).click()
        driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[1]/div/div[3]/div[4]/div[3]/div').click()
    except Exception:
        pass


collapse_buttons = driver.find_elements_by_css_selector('.suf-CompetitionMarketGroup suf-CompetitionMarketGroup-collapsed ')
for button in collapse_buttons:
    try:
        button.click()
    except Exception:
        # The click failed, most likely because a popup covered the button
        try_closing_popups()
    sleep(3)
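If you also want the group to still be expanded when a popup interrupts it, a tighter variant is to catch Selenium's ElementClickInterceptedException (the exception typically raised when another element covers the click target) and retry the click after dismissing the popups. A minimal sketch of that idea, reusing the same driver, selector, and try_closing_popups from above:

from time import sleep

from selenium.common.exceptions import ElementClickInterceptedException

for button in driver.find_elements_by_css_selector('.suf-CompetitionMarketGroup suf-CompetitionMarketGroup-collapsed '):
    try:
        button.click()
    except ElementClickInterceptedException:
        # A popup covered the button: close it, then retry the click once
        try_closing_popups()
        button.click()
    sleep(3)

This way any other failure still surfaces instead of being swallowed by a broad except.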