Why do I keep getting an [SSL: CERTIFICATE_VERIFY_FAILED] error when using APIs?


I've noticed that basically every time I try to use an API-based library (Quandl, pandas-datareader, or even a plain Google API for the search engine), I get the following error:

exceptions.SSLError (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)')))

I have tried updating certifi, setuptools, requests, and basically everything else, but nothing helps. I also downloaded a new cacert.pem file and replaced the old one, but the error persists.
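This error usually means Python cannot find, or does not trust, a CA bundle; that is common after an OS upgrade, or behind a corporate proxy that re-signs TLS traffic with its own self-signed root (which matches the "self signed certificate in certificate chain" wording). As a first diagnostic step, this minimal sketch (standard library only) shows which certificate paths the interpreter actually uses:

```python
import ssl

# Ask this interpreter where OpenSSL looks for trusted CA certificates.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)  # default CA bundle file, if any
print("capath:", paths.capath)  # default CA directory, if any

# certifi (the bundle requests loads by default) may point somewhere
# else entirely, which is why replacing one cacert.pem on disk often
# has no effect on requests-based libraries.
try:
    import certifi
    print("certifi bundle:", certifi.where())
except ImportError:
    print("certifi is not installed")
```

If `cafile`/`capath` are empty while certifi points elsewhere, editing a stray cacert.pem will not change what requests verifies against.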

Here is a code example and the error I am getting:

import pandas as pd
import pandas_datareader.data as web
import matplotlib.pyplot as plt

start_date = "2020-01-01"
end_date = "2020-12-31"

data = web.DataReader(name="TSLA", data_source='yahoo', start=start_date, end=end_date)
print(data)

requests.exceptions.SSLError: HTTPSConnectionPool(host='finance.yahoo.com', port=443): Max retries exceeded with url: /quote/TSLA/history?period1=1577847600&period2=1609469999&interval=1d&frequency=1d&filter=history (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)')))

I am using:

  • Python 3.10.6 (I was on Python 3.8 and it was working about two weeks ago, but after I updated my computer it stopped, so I upgraded to the latest version of Python)

  • pip 22.2.2

CodePudding user response:

Try passing a Session from requests.

This worked for me:

import datetime

import requests
import pandas_datareader as web

start = datetime.datetime(2018, 1, 1)
end = datetime.datetime.now()

session = requests.Session()
session.verify = False  # disables certificate verification entirely (insecure)

df = web.DataReader("TSLA", 'yahoo', start, end, session=session)
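A caveat on this workaround: `session.verify = False` turns certificate checking off entirely, so requests will emit an `InsecureRequestWarning` and the connection is exposed to man-in-the-middle attacks. A safer sketch, assuming certifi is installed (it ships as a requests dependency), keeps verification on and names the CA bundle explicitly:

```python
import requests
import certifi

session = requests.Session()
# Keep TLS verification enabled, but point at an explicit CA bundle.
# If a corporate proxy injects its own root certificate, append that
# certificate (PEM format) to this file, or set the REQUESTS_CA_BUNDLE
# environment variable to a bundle that contains it.
session.verify = certifi.where()
print(session.verify)  # path to the Mozilla CA bundle certifi ships
```

The session can then be passed to `DataReader` exactly as in the answer above, with verification still active.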