Selenium can't find all elements by xpath


I use Selenium to crawl data from a website with the code below:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

chromeDriverPath = r'C:\Program Files (x86)\chromedriver.exe'  # raw string so backslashes are not treated as escapes
url = 'https://shopee.vn/Thời-Trang-Nam-cat.11035567?page=0'
driver = webdriver.Chrome(chromeDriverPath)
driver.get(url)

try:
    main_xpath = '/html/body/div[1]/div/div[3]/div/div[4]/div[2]/div/div[2]'
    # wait until the product grid container is present in the DOM
    main = WebDriverWait(driver, 20).until(
        EC.presence_of_element_located((By.XPATH, main_xpath))
    )
    # each product card is a direct child div with data-sqe="item"
    product_list = main.find_elements(By.XPATH, './div[@data-sqe="item"]')
    for i, product in enumerate(product_list):
        print(i)
        print(product.text)
finally:
    driver.close()

However, only the first 15 elements return any text; the rest come back empty, even though the length of product_list matches the actual number of products on the page.
How can I get the text of all elements in product_list?

CodePudding user response:

import time

for i in range(1, len(product_list) // 15):
    # scroll the next batch of 15 product cards into view, then wait for them to render
    driver.execute_script("arguments[0].scrollIntoView();", product_list[i * 15])
    time.sleep(5)

Here's a hack I made: the page lazy-loads its product cards, so cards that have never been scrolled into the viewport keep empty text. The loop scrolls down 15 cards at a time and waits a few seconds for each batch to load; after that, reading product_list[i].text returns the real content.
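Putting the question's code and the scrolling hack together, a minimal end-to-end sketch might look like the following. The XPath, chromedriver path, 15-card batch size, and 5-second sleep are all taken from the posts above; they are page-specific heuristics and may need adjusting.

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
import time

driver = webdriver.Chrome(r'C:\Program Files (x86)\chromedriver.exe')  # path from the question
driver.get('https://shopee.vn/Thời-Trang-Nam-cat.11035567?page=0')

try:
    main_xpath = '/html/body/div[1]/div/div[3]/div/div[4]/div[2]/div/div[2]'
    main = WebDriverWait(driver, 20).until(
        EC.presence_of_element_located((By.XPATH, main_xpath))
    )
    product_list = main.find_elements(By.XPATH, './div[@data-sqe="item"]')

    # scroll through the grid in batches so every card is rendered at least once
    for i in range(1, len(product_list) // 15):
        driver.execute_script("arguments[0].scrollIntoView();", product_list[i * 15])
        time.sleep(5)  # crude wait; a more robust version could wait for non-empty card text

    # re-read the text only after scrolling, so lazy-loaded cards are populated
    for i, product in enumerate(product_list):
        print(i, product.text)
finally:
    driver.close()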
