I am working on a Selenium project that scrapes all the links on a web page, clicks each one, and then gets the title and description of the news article. I want to do this for every link on the home page (say, bbc.com), but once I click a link and switch back, the home page gets refreshed and the remaining links throw a stale element issue. Here is my code; any help would be much appreciated.
List<WebElement> allLinks = driver.findElements(By.tagName("a"));
for (int el = 0; el < allLinks.size(); el++) {
    hrefs = allLinks.get(el).getAttribute("href");
    try {
        allLinks.get(el).click();
        //newpage = driver.getWindowHandle();
        try {
            text = driver.findElement(By.xpath("*//div[contains(@class,'title-text')]")).getText();
            description = driver.findElement(By.xpath("*//div[contains(@class,'title-text')]/ancestor::div[contains(@class,'title')]/following::h2[contains(@class,'subtitle')]")).getText();
            noSwitch = false;
        } catch (Exception e) {
            // TODO Auto-generated catch block
            System.out.println("Page not navigated");
        }
    } catch (StaleElementReferenceException e) {
        System.out.println("Stale element issue");
        text = allLinks.get(el).getText();
        description = "No Description.";
    }
    System.out.println(hrefs + " - " + text + " - " + description);
    utility.setCellData("first sheet", rowcnt, 1, text);
    utility.setCellData("first sheet", rowcnt, 2, hrefs);
    utility.setCellData("first sheet", rowcnt, 3, description);
    rowcnt++;
    if (!noSwitch) {
        driver.switchTo().window(homepage);
    }
}
CodePudding user response:
You have to re-locate your links after coming back to the home page, because the old `WebElement` references go stale once the page reloads. Reload `allLinks` at the top of every loop iteration, like this:

List<WebElement> allLinks = driver.findElements(By.tagName("a"));
for (int el = 0; el < allLinks.size(); el++) {
    // add this line so each iteration works with fresh elements
    allLinks = driver.findElements(By.tagName("a"));
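Re-finding the elements works, but the indices can still drift if the home page's link list changes between visits. An alternative that avoids staleness entirely is to snapshot every href as a plain String before navigating anywhere, then open each URL with `driver.get(href)` instead of clicking, so no `WebElement` ever outlives the page it came from. A minimal, Selenium-free sketch of the snapshot/filter step (the class name, method name, and filtering rules here are my own, not from the question):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class LinkSnapshot {
    // Clean up a snapshot of raw href attribute values:
    // drop nulls, empty strings, and in-page fragments ("#..."),
    // and deduplicate while preserving first-seen order.
    public static List<String> usableHrefs(List<String> rawHrefs) {
        Set<String> seen = new LinkedHashSet<>();
        for (String href : rawHrefs) {
            if (href == null || href.isEmpty() || href.startsWith("#")) {
                continue;
            }
            seen.add(href);
        }
        return new ArrayList<>(seen);
    }
}
```

With Selenium, you would build the input list once, before the first navigation, e.g. by collecting `a.getAttribute("href")` for each anchor in `allLinks`, then loop over the cleaned strings with `driver.get(...)` and no `switchTo()` gymnastics.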