I am scraping a page where I am looking for all elements inside the following lists:
description = driver.find_elements(By.XPATH, "//div[contains(@class,'list') and h1/text()='Values']/ul[@class='bullet-list']")
But there is an unknown number of them on a given page, usually between 3 and 10. There is also an unknown number of items in each list (the item counts differ from list to list). If I search for the list items like this:
description = driver.find_elements(By.XPATH, "//div[contains(@class,'list') and h1/text()='Values']/ul[@class='bullet-list']/li")
I get all the list items, but then I don't know which list each one belongs to.
What I want to do is something like this:
description = driver.find_elements(By.XPATH, "//div[contains(@class,'list') and h1/text()='Values']/ul[@class='bullet-list']")
for x in range(len(description)):
    # this is the part that doesn't exist: a WebElement has no .xpath attribute
    values = driver.find_elements(By.XPATH, f"{description[x].xpath}/li")
    b_list = []
    for y in values:
        b_list.append(y.text)
    a_list.append(b_list)
So you can see why I need to know which list each item comes from, so that items from the same list end up inside the same sublist.
I can solve this with absolute XPaths, but it would be better if I can find a way with relative XPath (a rough sketch of the workaround I mean follows).
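Something like this, using positional indexing, is roughly what I mean by the absolute-XPath workaround (just a sketch, reusing the same locator as above):

a_list = []
list_xpath = "//div[contains(@class,'list') and h1/text()='Values']/ul[@class='bullet-list']"
for i in range(1, len(driver.find_elements(By.XPATH, list_xpath)) + 1):
    # XPath positions are 1-based; (...)[i] selects the i-th matching <ul>
    items = driver.find_elements(By.XPATH, f"({list_xpath})[{i}]/li")
    a_list.append([li.text for li in items])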
CodePudding user response:
If you want to get the items from multiple lists, grouped per list, you can try:
lists = []
description = driver.find_elements(By.XPATH, "//div[contains(@class,'list') and h1='Values']/ul[@class='bullet-list']")
for _list in description:
    # './li' is relative to _list, so only that list's <li> items are returned
    lists.append([li.text for li in _list.find_elements(By.XPATH, './li')])
print(lists)
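For completeness, here is how that snippet might fit into a full script; the driver setup and URL are placeholders, not part of the page in question:

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/page-with-lists")  # placeholder URL

lists = []
description = driver.find_elements(By.XPATH, "//div[contains(@class,'list') and h1='Values']/ul[@class='bullet-list']")
for _list in description:
    lists.append([li.text for li in _list.find_elements(By.XPATH, './li')])

print(lists)  # e.g. [['item 1', 'item 2'], ['item 3', 'item 4', 'item 5']]
driver.quit()

This gives a list of lists, one inner list per <ul>, which is the grouping asked for in the question.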