import bs4 as bs
import pickle
import requests
def find_and_save_CSI_300():
    response = requests.get('https://en.wikipedia.org/wiki/CSI_300_Index')
    soup = bs.BeautifulSoup(response.text, 'lxml')
    table = soup.find('tadle', {'class': 'wikitable sortable'})
    tickers = []
    for row in table.findAll('tr')[1:]:
        ticker = row.findAll('td')[0].text
        ticker = ticker[:6]
        tickers.append(ticker)
    with open("CSI_tickers.pickle", "Wb") as f:
        pickle.dump(tickers, f)
    print(tickers)
    return tickers

find_and_save_CSI_300()
I'm having an issue with the .findAll method. When I run the code above, it says the object has no attribute findAll.
CodePudding user response:
The error you are facing is caused by a typo on an earlier line, which says tadle. It should be table. Fixing that will get rid of the findAll error. You may need to check for subsequent errors and fix them as needed.
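For example (keeping the rest of the code unchanged), the corrected lookup would be:

# 'table' is the actual element name; 'tadle' matches nothing, so find() returns None
table = soup.find('table', {'class': 'wikitable sortable'})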
CodePudding user response:
The main problem is that you need to fix your element typo to table. Your code was looking for an element called tadle, so soup.find() returned None to signal that it was not found. When you then attempt to use this value for another findAll() call, it gives you the error.
It is usually a good idea to test the return value to ensure that the item you are looking for is present.
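A minimal sketch of such a check (the error message here is just an illustrative placeholder):

table = soup.find('table', {'class': 'wikitable sortable'})
if table is None:
    # find() returns None when no matching element exists on the page
    raise ValueError("Could not find the 'wikitable sortable' table on the page")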
If you are trying to extract the whole table, you could make the following changes:
import bs4 as bs
import pickle
import requests

def find_and_save_CSI_300():
    response = requests.get('https://en.wikipedia.org/wiki/CSI_300_Index')
    soup = bs.BeautifulSoup(response.text, 'lxml')
    table = soup.find('table', {'class': 'wikitable sortable'})
    tickers = []
    for tr in table.findAll('tr')[1:]:
        tickers.append([td.get_text(strip=True) for td in tr.findAll('td')])
    with open("CSI_tickers.pickle", "wb") as f:
        pickle.dump(tickers, f)
    print(tickers)
    return tickers

find_and_save_CSI_300()
This would give you a list holding each row of values:
[['2005', '923.45', '', ''], ['2006', '2,041.05', '1,117.60', '121.02'], ['2007', '5,338.28', '3,297.23', '161.55'], ['2008', '1,817.72', '−3,520.56', '−65.95'], ['2009', '3,575.68', '1,757.96', '96.71'], ['2010', '3,128.26', '−447.42', '−12.51'], ['2011', '2,345.74', '−782.52', '−25.01'], ['2012', '2,522.95', '177.21', '7.55'], ['2013', '2,330.03', '−192.92', '−7.65'], ['2014', '3,533.71', '1,203.68', '51.66'], ['2015', '3,731.00', '197.29', '5.58'], ['2016', '3,310.08', '−420.92', '−11.28'], ['2017', '4,030.85', '720.77', '21.78'], ['2018', '3,010.65', '−1,020.20', '−25.31'], ['2019', '4,096.58', '1,085.93', '36.07'], ['2020', '5,211.29', '1,114.71', '27.21']]
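If you later want to read the saved rows back, a minimal sketch (assuming the pickle file written by the function above; the variable name rows is just for illustration) would be:

import pickle

with open("CSI_tickers.pickle", "rb") as f:
    rows = pickle.load(f)
print(rows[0])  # first data row of the table, e.g. ['2005', '923.45', '', '']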