web page is not filtering items in selenium


I am trying to scrape data from a web page, but the problem is that the filter is not applied when I run the following code. The page shows 120 items; after the send_keys() and click() calls it should show only 73 items, and then I should retrieve those 73 results. Does anyone know how to fix this? TIA

from selenium.webdriver.common.by import By
import time

# Type the location into the search box and wait for the suggestions to load
driver.find_element(by=By.XPATH, value='//*[@id="locationSearch"]').send_keys('Ontario, CA')
time.sleep(5)
# Pick the first suggestion from the autocomplete dropdown
driver.find_element(by=By.XPATH, value='/html/body/div/div/div[3]/div/div[3]/div[2]/div/div[2]/div/div[2]/div[1]/div/ul/li[1]').click()
time.sleep(5)
# Click the button that applies the location filter
driver.find_element(by=By.XPATH, value='//*[@id="app"]/div[3]/div/div[3]/div[2]/div/div[2]/div/div[2]/div[2]/button').click()
time.sleep(5)

# Collect the links of the (supposedly filtered) listings
getListingUrls = driver.find_elements(
    by=By.XPATH, value='//*[@id="catalog-listing"]/article/div/div[1]/div[2]/div/div/div/div[1]/div[1]/div[2]/h2/a')

CodePudding user response:

import undetected_chromedriver as uc

driver = uc.Chrome()

Thanks @furas https://stackoverflow.com/users/1832058/furas
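To make the suggestion concrete, here is a minimal sketch of how the undetected_chromedriver instance could replace the regular ChromeDriver in the script from the question. The URL is a placeholder (the post does not show which site is being scraped), and the shortened suggestion XPath plus the use of WebDriverWait instead of fixed sleeps are my own assumptions, not part of the original answer.

# Sketch: same flow as the question, but driven by undetected_chromedriver.
import time

import undetected_chromedriver as uc
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = uc.Chrome()  # uc.Chrome() replaces the standard webdriver.Chrome()
driver.get("https://example.com/listings")  # placeholder URL; the real site is not named in the post

wait = WebDriverWait(driver, 15)

# Type the location, pick the first suggestion, then apply the filter.
wait.until(EC.presence_of_element_located((By.ID, "locationSearch"))).send_keys("Ontario, CA")
wait.until(EC.element_to_be_clickable(
    (By.XPATH, '//*[@id="app"]//ul/li[1]'))).click()  # suggestion dropdown; XPath shortened as an assumption
wait.until(EC.element_to_be_clickable(
    (By.XPATH, '//*[@id="app"]/div[3]/div/div[3]/div[2]/div/div[2]/div/div[2]/div[2]/button'))).click()

# Give the catalog time to re-render, then collect the filtered listing links.
time.sleep(5)
listings = driver.find_elements(
    By.XPATH, '//*[@id="catalog-listing"]/article//h2/a')
print(len(listings))  # should report 73 once the filter is actually applied
for a in listings:
    print(a.get_attribute("href"))

If the count still comes back as 120, the filter click is most likely being blocked or the page is detecting automation, which is exactly the case undetected_chromedriver is meant to work around.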
