I'm using Selenium to scrape some data from this website: https://www.boerse-frankfurt.de/en/etf/ishares-core-s-p-500-ucits-etf-usd-acc
My script works, but I have to add a 5-second wait to be sure the data is actually available; otherwise I sometimes (randomly) get a blank result.
driver.get(url)
time.sleep(5)
bid = driver.find_element_by_xpath("/html/body/app-root/app-wrapper/div/div[2]/app-etp/div[2]/div[3]/div[2]/div/div[2]/app-widget-quote-box/div/div/table/tbody/tr[3]/td[1]").text
print(bid)
I tried to use Selenium's explicit wait, but the script still finishes quickly and returns a blank result:
driver.get(url)
try:
    bid = WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.XPATH, "/html/body/app-root/app-wrapper/div/div[2]/app-etp/div[2]/div[3]/div[2]/div/div[2]/app-widget-quote-box/div/div/table/tbody/tr[3]/td[1]"))
    )
    bid = bid.text
    print(bid)
finally:
    pass
Any idea how to make Selenium wait just as long as necessary to scrape the data? Thanks
Update: A simple while loop seems to work:
driver.get(url)
while driver.find_element_by_xpath(xpath).text == "":
    pass
bid = driver.find_element_by_xpath(xpath).text
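The same polling idea can probably also be expressed with WebDriverWait and a custom condition that waits for non-empty text. A rough, untested sketch (assuming driver and xpath are already defined as above):

from selenium.webdriver.support.ui import WebDriverWait

driver.get(url)
# Poll (every 0.5 s by default, up to 10 s) until the element's text is non-empty.
# The lambda re-finds the element on each poll and returns the text once it is truthy,
# so until() returns the text itself; NoSuchElementException is retried by default.
bid = WebDriverWait(driver, 10).until(
    lambda d: d.find_element_by_xpath(xpath).text or False
)
print(bid)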
Is there a better way to do that?
CodePudding user response:
To wait for the element to be visible, instead of presence_of_element_located() you need to induce WebDriverWait with visibility_of_element_located(), and you can use the following locator strategy:
driver.get(url)
try:
    print(WebDriverWait(driver, 20).until(EC.visibility_of_element_located((By.XPATH, "/html/body/app-root/app-wrapper/div/div[2]/app-etp/div[2]/div[3]/div[2]/div/div[2]/app-widget-quote-box/div/div/table/tbody/tr[3]/td[1]"))).text)
finally:
    pass
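Note: the snippet above assumes the usual imports:

from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC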