Zillow web scraping using Selenium & BeautifulSoup


I need to scrape 3 pages of California rental houses on Zillow and put all the data into a pandas DataFrame. I need to pull every feature of each listing: address, city, number of bedrooms and bathrooms, size of the house, size of the lot, year built, rent price, and rent date.

My code:

from bs4 import BeautifulSoup
import requests

import time
import os
import random
import re

!pip install selenium
!pip install webdriver-manager
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from webdriver_manager.chrome import ChromeDriverManager
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By

# Build the options before creating the driver and pass them in,
# otherwise the --disable-blink-features flag is never applied.
options = Options()
options.add_argument('--disable-blink-features=AutomationControlled')
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()), options=options)

import pandas as pd
import scipy as sc
import numpy as np
import sys



req_headers = {
    'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
    'accept-encoding': 'gzip, deflate, br',
    'accept-language': 'en-US,en;q=0.8',
    'upgrade-insecure-requests': '1',
    'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951 Safari/537.36'
}
response = requests.get("https://www.zillow.com/homes/for_rent/CA/house_type/",headers=req_headers)
print(response)
soup = BeautifulSoup(response.content, 'html.parser')
print(soup.prettify())


listing_urls = []

listings = soup.find_all("article", {"class": "list-card list-card-additional-attribution list-card_not-saved"})

for listing in listings:
    listing_url = listing.find("a")["href"]
    print(listing_url)
    listing_urls.append(listing_url)

I got stuck here - I get the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_24224/2055957203.py in <module>
      4 
      5 for listing in listings:
----> 6     listing_url = listing.find("a")["href"]
      7 
      8     print(listing_url)

TypeError: 'NoneType' object is not subscriptable

In addition, the code prints only 2 links for the whole page (every page has 40 listings of houses/apartments for rent)

Thank you ! :)

CodePudding user response:

The data is generated by an external source via an API and is also stored in a script tag as JSON inside an HTML comment. So you can easily pull all of the data using re or the API. Here I use the re module.

import requests
import re
import json

r = requests.get('https://www.zillow.com/ca/rentals/2_p/?searchQueryState={"pagination":{"currentPage":2},"usersSearchTerm":"CA","mapBounds":{"west":-150.332039625,"east":-96.19141462500001,"south":21.05981671373876,"north":54.12119640028577},"regionSelection":[{"regionId":9,"regionType":2}],"isMapVisible":false,"filterState":{"fsba":{"value":false},"fsbo":{"value":false},"nc":{"value":false},"fore":{"value":false},"cmsn":{"value":false},"auc":{"value":false},"fr":{"value":true},"ah":{"value":true},"mf":{"value":false},"manu":{"value":false},"land":{"value":false}},"isListVisible":true,"mapZoom":4}',headers = {'User-Agent':'Mozilla/5.0'})

data = json.loads(re.search(r'!--(\{"queryState".*?)-->', r.text).group(1))

for item in data['cat1']['searchResults']['listResults']:
    listing_url = 'https://www.zillow.com' + item['detailUrl']
    print(listing_url)

Output:

https://www.zillow.com/b/morgan-park-vacaville-ca-5XjWBr/
https://www.zillow.com/b/renaissance-park-apartments-davis-ca-5XjQ6D/
https://www.zillow.com/b/alivia-apartments-whittier-ca-BwXXhk/
https://www.zillow.comhttps://www.zillow.com/homedetails/7845-Stewart-And-Gray-Rd-783498CC7-Downey-CA-90241/2065561183_zpid/
https://www.zillow.com/b/admirals-cove-alameda-ca-BPfvbS/
https://www.zillow.com/b/parkside-union-city-ca-5XjLjS/
https://www.zillow.com/b/coventry-park-roseville-ca-5XjQjF/
https://www.zillow.com/b/capitol-yards-west-sacramento-ca-5hHzYJ/
https://www.zillow.comhttps://www.zillow.com/homedetails/8950-Arrow-Rte-APT-145-Rancho-Cucamonga-CA-91730/2063289079_zpid/
https://www.zillow.com/b/twin-palms-apartments-san-jose-ca-5ZsGBN/
https://www.zillow.com/b/pavona-apartments-san-jose-ca-5XjTRx/
https://www.zillow.com/b/imt-pleasanton-pleasanton-ca-5XkG5W/
https://www.zillow.comhttps://www.zillow.com/homedetails/1516-Sylvan-Way-8328CE707-Lodi-CA-95242/2066702720_zpid/
https://www.zillow.com/b/vineyard-terrace-apartments-napa-ca-5XjN2m/
https://www.zillow.com/b/autumn-springs-livermore-ca-5XjRGt/
https://www.zillow.com/b/tamarack-woods-apartment-homes-brea-ca-5XjSZJ/        
https://www.zillow.com/b/serrano-apartments-west-covina-ca-5XnMXG/
https://www.zillow.com/b/castlerock-riverside-ca-5XkJLW/
https://www.zillow.com/b/lasselle-place-moreno-valley-ca-5XjThf/
https://www.zillow.com/b/the-crossing-at-arroyo-trail-livermore-ca-5XjR44/     
https://www.zillow.com/b/canyon-park-riverside-ca-5XjVV8/
https://www.zillow.com/b/55+-community-fountainglen-grand-isle-murrieta-ca-5XjT3M/
https://www.zillow.com/b/calavo-woods-apartments-spring-valley-ca-5XjSpC/      
https://www.zillow.com/b/axiom-tustin-tustin-ca-BWTZgR/
https://www.zillow.com/b/colonnade-riverside-ca-5XjxP7/
https://www.zillow.com/b/pinnacle-lemoli-hawthorne-ca-5j35wR/
https://www.zillow.com/b/verge-san-diego-ca-5hJ5ML/
https://www.zillow.com/b/palmia,-age-55+-luxury-apartments-fremont-ca-C2vgyw/
https://www.zillow.com/b/sage-creek-south-simi-valley-ca-5XtP4x/
https://www.zillow.com/b/dublin-station-by-windsor-dublin-ca-63Pjjc/
https://www.zillow.com/b/wilshire-margot-los-angeles-ca-5XjVmT/
https://www.zillow.com/b/carefree-north-natomas-senior-apartments-sacramento-ca-5XjW9V/
https://www.zillow.com/b/tradition-carlsbad-ca-5XjQWM/
https://www.zillow.com/b/sherwood-at-iron-point-folsom-ca-5XjT5w/
https://www.zillow.comhttps://www.zillow.com/homedetails/139-Santa-Rosa-Dr-00749DD7E-Oceanside-CA-92058/2066712796_zpid/
https://www.zillow.com/b/casa-madrid-anaheim-ca-5XjKvK/
https://www.zillow.com/b/metro-gateway-riverside-ca-B8GcqC/
https://www.zillow.comhttps://www.zillow.com/homedetails/18427-Studebaker-Rd-C99807DD9-Cerritos-CA-90703/2066760882_zpid/
https://www.zillow.com/b/the-vineyard-luxury-apartments-petaluma-ca-5Xj7CH/    
https://www.zillow.com/b/renew-one59-san-leandro-ca-5XjxJb/
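
To cover the original requirement (three pages collected into a pandas DataFrame), the same comment-JSON trick can be looped over pages. The sketch below is assumption-heavy: it rebuilds a minimal searchQueryState per page, and the field names (address, addressCity, beds, baths, area, price, detailUrl) are keys that usually appear in listResults but can vary per listing, so everything is read defensively with .get().

import requests
import re
import json
import time
import urllib.parse
import pandas as pd

rows = []
for page in range(1, 4):  # pages 1-3, as the question asks
    # Assumed minimal searchQueryState; Zillow may expect more fields in practice.
    query_state = {
        "pagination": {"currentPage": page},
        "usersSearchTerm": "CA",
        "regionSelection": [{"regionId": 9, "regionType": 2}],
        "filterState": {"fr": {"value": True}, "ah": {"value": True}},
        "isListVisible": True,
    }
    url = ('https://www.zillow.com/ca/rentals/{}_p/?searchQueryState={}'
           .format(page, urllib.parse.quote(json.dumps(query_state))))
    r = requests.get(url, headers={'User-Agent': 'Mozilla/5.0'})
    m = re.search(r'!--(\{"queryState".*?)-->', r.text)
    if not m:  # Zillow may serve a captcha page instead of results
        continue
    data = json.loads(m.group(1))
    for item in data['cat1']['searchResults']['listResults']:
        rows.append({
            'address': item.get('address'),
            'city': item.get('addressCity'),
            'beds': item.get('beds'),
            'baths': item.get('baths'),
            'area': item.get('area'),      # living area in sqft, when present
            'price': item.get('price'),
            'url': item.get('detailUrl'),
        })
    time.sleep(2)  # be polite between requests

df = pd.DataFrame(rows)
print(df.shape)
print(df.head())

Note that year built, lot size and the listing date usually only appear on the detail page, so getting those would take a second request per detailUrl.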

Using Selenium you will never be able to extract 40 items per page, because the rest of the listings are restricted inside the HTML comment and are only rendered lazily, so you will get only about 8 items.

import time
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager

driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
driver.get('https://www.zillow.com/homes/for_rent/CA/house_type/')
time.sleep(5)
driver.maximize_window()

soup = BeautifulSoup(driver.page_source, 'html.parser')
info_list = soup.select('div.list-card-info > a')
for link in info_list:
    get_url = link['href']
    print(get_url)

Output:

https://www.zillow.com/homedetails/176-W-Festivo-Ln-Mountain-House-CA-95391/69031126_zpid/
https://www.zillow.com/homedetails/1967-Linden-Ln-Milpitas-CA-95035/19473336_zpid/      
https://www.zillow.com/homedetails/3341-Idlewild-Way-San-Diego-CA-92117/2086677504_zpid/
/b/960-market-st-san-francisco-ca-9NJsbZ/
https://www.zillow.com/homedetails/2915-Tuna-Canyon-Rd-Topanga-CA-90290/95700373_zpid/        
https://www.zillow.com/homedetails/7332-Pacific-View-Dr-Los-Angeles-CA-90068/20045509_zpid/   
https://www.zillow.com/homedetails/2267-Filbert-St-San-Francisco-CA-94123/2063204947_zpid/    
https://www.zillow.com/homedetails/28145-Via-Princesa-APT-8-Murrieta-CA-92563/2082116207_zpid/
https://www.zillow.com/homedetails/312-Birkdale-Way-Bakersfield-CA-93309/18957251_zpid/
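
If you still want to go through Selenium, the usual workaround for the ~8-card limit is to scroll so the lazily rendered cards get added to the DOM before reading page_source. This is only a sketch: the selector and the scroll loop are assumptions and may need tuning (on some layouts you have to scroll the results panel rather than the window).

import time
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager

driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
driver.get('https://www.zillow.com/homes/for_rent/CA/house_type/')
time.sleep(5)
driver.maximize_window()

# Scroll in steps so more of the lazily loaded listing cards render into the DOM.
for _ in range(10):
    driver.execute_script('window.scrollBy(0, 800);')
    time.sleep(1)

soup = BeautifulSoup(driver.page_source, 'html.parser')
links = [a['href'] for a in soup.select('div.list-card-info > a') if a.has_attr('href')]
print(len(links))
for url in links:
    print(url)

driver.quit()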