I'm extracting data from the DESWATER website and saving the results to a CSV file. To make a small example of the issue: I have two authors, one with a full text file and one without, so the file ends up saved against the wrong author.
So the CSV output looks like this:
Authors | File
First Author | Second File
Second Author | Third File
But I want the output to look like this:
Authors | File
First Author | 'No File'
Second Author | Second File
Third Author | Third File
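The root of the mismatch is that zip() pairs items purely by position and stops at the shorter list. A minimal sketch with made-up lists (not the scraped data) shows what happens:
authors = ['First Author', 'Second Author', 'Third Author']
files = ['Second File', 'Third File']  # the first author has no full text file

# zip pairs by position only, so the first file found lands on the first author
for author, file in zip(authors, files):
    print(author, '|', file)
# First Author | Second File
# Second Author | Third File
# Third Author is dropped entirely because zip stops at the shorter list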
Here is a small test script:
from bs4 import BeautifulSoup
import requests
import time
import csv

list_of_authors = []
list_of_full_file = []

r = requests.get('https://www.deswater.com/vol.php?vol=1&oth=1|1-3|January|2009')

# Parsing the HTML
soup = BeautifulSoup(r.content, 'html.parser')

# 'Author'
s = soup.find('td', class_='testo_normale')
authors = s.find_all('i')
for author in authors:
    list_of_authors.append(author.text.strip())
    time.sleep(1)

# 'FULL TEXT'
# find all the anchor tags with "href"
n = 1
for link in soup.find_all('a', class_='testo_normale_rosso'):
    if "fulltext.php?abst=" in link.get('href'):
        # TO ADD
        baseurl = 'https://www.deswater.com/'
        Full_links = baseurl + link.attrs['href'].replace('\n', '')
        list_of_full_file.append(f'file {n}')
        n += 1
        time.sleep(1)

def Save_csv():
    row_head = ['Author', 'File Name']
    Data = []
    for author, file in zip(list_of_authors, list_of_full_file):
        Data.append(author)
        Data.append(file)
    rows = [Data[i:i + 2] for i in range(0, len(Data), 2)]
    with open('data.csv', 'w', encoding='utf_8_sig', newline="") as csvfile:
        csvwriter = csv.writer(csvfile)
        csvwriter.writerow(row_head)
        csvwriter.writerows(rows)

Save_csv()
This code will ultimately extract data from 279 pages, so I need it to automatically detect when an author has no full text file, so I can append 'No File' instead.
See the correct matching on the website for reference: the first author doesn't have a full text file. Any ideas?
CodePudding user response:
Try changing your strategy for selecting the elements, and avoid multiple lists if you cannot ensure they will have the same length. Use CSS selectors to select all the <hr> elements; they are the base for all other selections with find_previous():
for e in soup.select('.testo_normale hr'):
    data.append({
        'author': e.find_previous('i').text,
        'file': 'https://www.deswater.com/' + e.find_previous('a').get('href') if 'fulltext' in e.find_previous('a').get('href') else 'no url'
    })
Example
from bs4 import BeautifulSoup
import requests
import csv

soup = BeautifulSoup(requests.get('https://www.deswater.com/vol.php?vol=1&oth=1|1-3|January|2009').content, 'html.parser')

with open('data.csv', 'w', encoding='utf-8', newline='') as f:
    data = []

    for e in soup.select('.testo_normale hr'):
        data.append({
            'author': e.find_previous('i').text,
            'file': 'https://www.deswater.com/' + e.find_previous('a').get('href') if 'fulltext' in e.find_previous('a').get('href') else 'no url'
        })

    dict_writer = csv.DictWriter(f, data[0].keys())
    dict_writer.writeheader()
    dict_writer.writerows(data)
Output
author,file
Miriam Balaban,no url
W. Richard Bowen,https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzEucGRm&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@[email protected]@13@kRichardk@13@kBowenk@1@kk@4@kik@2@kk@1@kbrk@2@kWaterk@13@kengineeringk@13@kfork@13@kthek@13@kpromotionk@13@kofk@13@kpeacek@1@kbrk@2@k1k@15@k2009k@16@k1k@35@k6k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@[email protected]@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2NC9URFdUX0FfMTA1MTI4NjRfTy5wZGY=&type=1
Steven J. Duranceau,https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzcucGRm&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@2@kStevenk@[email protected]@13@kDuranceauk@1@kk@4@kik@2@kk@1@kbrk@2@kModelingk@13@kthek@13@kpermeatek@13@ktransientk@13@kresponsek@13@ktok@13@kperturbationsk@13@kfromk@13@ksteadyk@13@kstatek@13@kink@13@kak@13@knanofiltrationk@13@kprocessk@1@kbrk@2@k1k@15@k2009k@16@k7k@35@k16k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@[email protected]@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2NS9URFdUX0FfMTA1MTI4NjVfTy5wZGY=&type=1
"Dmitry Lisitsin, David Hasson, Raphael Semiat",https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzE3LnBkZg==&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@2@kDmitryk@13@kLisitsink@6@kk@13@kDavidk@13@kHassonk@6@kk@13@kRaphaelk@13@kSemiatk@1@kk@4@kik@2@kk@1@kbrk@2@kModelingk@13@kthek@13@keffectk@13@kofk@13@kantik@35@kscalantk@13@konk@13@kCaCO3k@13@kprecipitationk@13@kink@13@kcontinuousk@13@kflowk@1@kbrk@2@k1k@15@k2009k@16@k17k@35@k24k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@[email protected]@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2Ni9URFdUX0FfMTA1MTI4NjZfTy5wZGY=&type=1
"M.A. Darwish, Fatima M. Al-Awadhi, A. Akbar, A. Darwish",https://www.deswater.com/fulltext.php?abst=XFxEV1RfYWJzdHJhY3RzXFx2b2xfMVxcMV8yMDA5XzI1LnBkZg==&desc=k@1@kfontk@13@kfacek@7@kk@30@kGenevak@6@kk@13@kArialk@6@kk@13@kHelveticak@6@kk@13@ksank@35@kserifk@30@kk@13@ksizek@7@kk@30@k2k@30@kk@2@kk@1@kik@[email protected]@13@kDarwishk@6@kk@13@kFatimak@[email protected]@13@kAlk@35@kAwadhik@6@kk@[email protected]@13@kAkbark@6@kk@[email protected]@13@kDarwishk@1@kk@4@kik@2@kk@1@kbrk@2@kAlternativek@13@kprimaryk@13@kenergyk@13@kfork@13@kpowerk@13@kdesaltingk@13@kplantsk@13@kink@13@kKuwaitk@32@kk@13@kthek@13@knucleark@13@koptionk@13@kIk@1@kbrk@2@k1k@15@k2009k@16@k25k@35@k41k@1@kbrk@4@kk@2@kk@1@kak@13@khrefk@7@kDWTk@12@kabstractsk@4@kvolk@12@k1k@4@k1k@12@k2009k@[email protected]@13@kclassk@7@kk@5@kk@30@ktestok@12@knormalek@12@krossok@5@kk@30@kk@13@ktargetk@7@kk@5@kk@30@kk@12@kblankk@5@kk@30@kk@2@kAbstractk@1@kk@4@kak@2@kk@1@kbrk@2@k&id23=RFdUX2FydGljbGVzL1REV1RfSV8wMV8wMS0wM190ZmphL1REV1RfQV8xMDUxMjg2Ny9URFdUX0FfMTA1MTI4NjdfTy5wZGY=&type=1
...
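To get the exact 'No File' label from the desired output instead of 'no url', the else branch can simply return that string; a minimal variant sketch, assuming the same page structure as above:
from bs4 import BeautifulSoup
import requests
import csv

soup = BeautifulSoup(requests.get('https://www.deswater.com/vol.php?vol=1&oth=1|1-3|January|2009').content, 'html.parser')

rows = []
for e in soup.select('.testo_normale hr'):
    href = e.find_previous('a').get('href')
    rows.append({
        'author': e.find_previous('i').text,
        # 'No File' matches the wording requested in the question
        'file': 'https://www.deswater.com/' + href if 'fulltext' in href else 'No File'
    })

with open('data.csv', 'w', encoding='utf_8_sig', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=['author', 'file'])
    writer.writeheader()
    writer.writerows(rows)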