This may be a very basic question, but I'm practicing web scraping a dynamic page with Selenium and I'd like to know if there is a way to test just the table-scraping portion without running the whole script. Am I being a noob and just not seeing what I'm doing wrong? My code has many delays to prevent errors while Selenium clicks through buttons and signs in to reach the page with the table, so every time I re-test the scrape I have to sit through the entire script running again.
CodePudding user response:
I added WebDriverWait to your script and simplified it. Explicit waits return as soon as the element is ready instead of sleeping for a fixed duration, so they replace most of your hard-coded delays. Note that you have to import WebDriverWait and expected_conditions as EC:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException

driver.get("webpage")
wait = WebDriverWait(driver, 20)

enter_username = input('Enter Username: ')
enter_password = input('Enter Password: ')

wait.until(EC.visibility_of_element_located((By.ID, "UserName"))).send_keys(enter_username)  # username box
wait.until(EC.visibility_of_element_located((By.ID, "Password"))).send_keys(enter_password)  # password box
driver.switch_to.default_content()
wait.until(EC.visibility_of_element_located((By.CLASS_NAME, "btn-primary"))).click()  # sign-in button
wait.until(EC.visibility_of_element_located((By.ID, "portlet"))).click()  # smart search box
wait.until(EC.visibility_of_element_located((By.ID, "Search"))).send_keys("Search Results")  # search box

try:
    # Code to click the captcha checkbox
    # Code to solve the recaptcha
    pass
except TimeoutException:
    print("Recaptcha did not appear")

wait.until(EC.visibility_of_element_located((By.ID, "Submit"))).click()  # submit button

# BeautifulSoup parsing -- this is where the table data is saved
print(df)
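As for testing just the table scrape without re-running the login flow: once your script has reached the table page, you can save driver.page_source to a file once, then iterate on the parsing code against that saved HTML with no browser involved. A minimal sketch, assuming BeautifulSoup is installed; the filename, the parse_results_table helper, and the sample table below are all hypothetical stand-ins for your real page:

```python
from bs4 import BeautifulSoup

# One-time step, run from your live Selenium session (hypothetical filename):
#     with open("table_page.html", "w", encoding="utf-8") as f:
#         f.write(driver.page_source)

def parse_results_table(html):
    """Parse the first HTML table into a list of row dicts keyed by header."""
    soup = BeautifulSoup(html, "html.parser")
    table = soup.find("table")  # adjust this selector to match your page
    headers = [th.get_text(strip=True) for th in table.find_all("th")]
    rows = []
    for tr in table.find_all("tr"):
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:  # skip the header row, which has <th> instead of <td>
            rows.append(dict(zip(headers, cells)))
    return rows

# Offline test against a saved (here, inline) snapshot -- no logins, no delays.
sample = """
<table>
  <tr><th>Name</th><th>Score</th></tr>
  <tr><td>Alice</td><td>10</td></tr>
  <tr><td>Bob</td><td>7</td></tr>
</table>
"""
print(parse_results_table(sample))
```

Once the parser works against the snapshot, drop it back into the full script and feed it driver.page_source directly.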