How to check next page button is available or not using python

Time:09-12

I want to scrape some event data from a page, then go to the next page and scrape its event data, and repeat the same process until the next-page button is no longer available. My question is how to implement the if..else condition that checks whether the button is available.

Condition code

while True:
    for i in range(len(links)):
        scrapy()

    if ...:  # Write condition to check whether the next-page button is available
        driver.get('https://shoobs.com/find-events')
        next_page = '//*[@]/span[7]/a'
        link = WebDriverWait(driver, timeout=160).until(lambda d: d.find_element(By.XPATH, next_page))
        driver.execute_script("arguments[0].click();", link)
        scrapy()
    else:
        break

driver.quit()

scrapy() function code

def scrapy():
    selector = '.event-poster'
    event_name = '.col-md-12 h1'
    # re-find the event posters on the current page; i is the loop index
    # from the calling loop (a global here)
    links = WebDriverWait(driver, timeout=500).until(lambda d: d.find_elements(By.CSS_SELECTOR, selector))
    links[i].click()
    name_e = WebDriverWait(driver, timeout=500).until(lambda d: d.find_element(By.CSS_SELECTOR, event_name))
    print(name_e.text)
    driver.back()

Libraries and driver

from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.chrome.service import Service
from selenium import webdriver
from selenium.webdriver.common.by import By
#import time
#import csv
from selenium.webdriver.support.wait import WebDriverWait

driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
selector = '.event-poster'
driver.get('https://shoobs.com/find-events')
links = WebDriverWait(driver, timeout=500).until(lambda d: d.find_elements(By.CSS_SELECTOR, selector))

CodePudding user response:

from selenium.common.exceptions import TimeoutException
from selenium.webdriver.support import expected_conditions as EC

try:
    # buttonxpath is the XPath of the next-page button
    nextpagebutton = WebDriverWait(driver, 20).until(EC.presence_of_element_located((By.XPATH, buttonxpath)))
except TimeoutException:
    print("No button available")
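An alternative that avoids exception handling: `find_elements` (plural) never raises; it returns an empty list when nothing matches, so the presence check becomes a truthiness test. The control flow can be sketched with a stub standing in for the real driver (the XPath below is a hypothetical placeholder, not taken from the question's page):

```python
class StubDriver:
    """Stand-in for a Selenium driver; pretends a 'next' button may exist."""
    def __init__(self, has_next):
        self.has_next = has_next

    def find_elements(self, by, value):
        # Like Selenium's find_elements: returns [] instead of raising
        # when nothing matches.
        return ["<next-button>"] if self.has_next else []


def next_page_available(driver, xpath="//a[@rel='next']"):
    # Empty list is falsy, so this is True only when the button exists.
    return bool(driver.find_elements("xpath", xpath))


print(next_page_available(StubDriver(True)))   # True
print(next_page_available(StubDriver(False)))  # False
```

With a real driver you would pass `By.XPATH` and the button's actual XPath instead of the stub arguments.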

CodePudding user response:

Update your code:

from selenium.common.exceptions import NoSuchElementException

while True:
    for i in range(len(links)):
        scrapy()

    ifNext = False
    try:
        driver.find_element(By.CLASS_NAME, 'next')  # raises if the button is absent
        ifNext = True
    except NoSuchElementException:
        pass

    if ifNext:
        driver.get('https://shoobs.com/find-events')
        next_page = '//*[@]/span[7]/a'
        link = WebDriverWait(driver, timeout=160).until(lambda d: d.find_element(By.XPATH, next_page))
        driver.execute_script("arguments[0].click();", link)
        scrapy()
    else:
        break
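Putting the pieces together, the loop's structure is: scrape the current page, check for the next-page button, click it if present, otherwise break. A minimal sketch of that control flow, with a stub simulating a site that has a fixed number of pages (the real code would use `driver.find_element(By.CLASS_NAME, 'next')` for the check and `scrapy()` for the scraping):

```python
class StubPager:
    """Stand-in for the driver: pretends the site has a fixed number of pages."""
    def __init__(self, pages):
        self.pages = pages
        self.current = 1

    def has_next(self):
        # Stands in for the next-page-button presence check.
        return self.current < self.pages

    def click_next(self):
        self.current += 1


def scrape_all_pages(pager):
    scraped = []
    while True:
        scraped.append(pager.current)  # stands in for scrapy() over all links
        if pager.has_next():           # the if..else the question asks about
            pager.click_next()
        else:
            break                      # no next button -> stop paging
    return scraped


print(scrape_all_pages(StubPager(3)))  # [1, 2, 3]
```

The key point is that the loop only ever exits through the `else: break` branch, so the presence check alone decides when scraping stops.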