I've read plenty of information about using Selenium and chromedriver, but nothing has helped.
Then I tried undetected_chromedriver:
import undetected_chromedriver as uc

url = "<url>"

driver = uc.Chrome()  # launches a patched Chrome instance
driver.get(url)
driver.quit()
However, I get this error:
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)>
The guides I found online for avoiding this error didn't help.
Maybe there's simply a way to make the code wait about 5 seconds while the browser check is in progress?
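To show what I mean, a rough sketch (time.sleep is just a placeholder for a smarter wait; the ssl override is one workaround I've seen suggested for the certificate error, at the cost of disabling verification):

import ssl
import time

# Suggested workaround for CERTIFICATE_VERIFY_FAILED: urllib, which
# undetected_chromedriver uses to download the driver, will skip
# certificate verification. This weakens security, so use with care.
ssl._create_default_https_context = ssl._create_unverified_context

import undetected_chromedriver as uc

url = "<url>"

driver = uc.Chrome()
driver.get(url)
time.sleep(5)  # crude wait for the site's browser check to finish
print(driver.title)
driver.quit()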
CodePudding user response:
You will need to install two libraries, beautifulsoup4 and requests:
pip install beautifulsoup4
pip install requests
After that, try this code:
from bs4 import BeautifulSoup
import requests

html = requests.get("your url here").text
soup = BeautifulSoup(html, 'html.parser')
print(soup)

# Use this pattern to find elements:
# find_text = soup.find('pre', {'class': 'brush: python; title: ; notranslate'}).get_text()
Here is the BeautifulSoup documentation: https://www.crummy.com/software/BeautifulSoup/bs4/doc/
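Note that if the site runs a browser check like the one in the question, a bare requests.get will often be blocked or served a challenge page. Sending browser-like headers sometimes gets past the simpler checks; a minimal sketch (the URL and the User-Agent string are placeholders):

from bs4 import BeautifulSoup
import requests

# A browser-like User-Agent; simpler bot checks only inspect this header.
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) "
                  "Chrome/120.0.0.0 Safari/537.36"
}

html = requests.get("your url here", headers=headers, timeout=30).text
soup = BeautifulSoup(html, "html.parser")
print(soup.title)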
CodePudding user response:
Well, I used Grab instead of requests, and now it works, so I think Grab has some way of getting past the check.
Grab documentation: https://grab.readthedocs.io/en/latest/
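In case it helps anyone, the basic fetch looks roughly like this (sketched from the linked docs; the URL is a placeholder):

from grab import Grab

g = Grab()
resp = g.go("your url here")  # perform the request

print(resp.code)  # HTTP status code
print(resp.body)  # raw response body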