Web Scraping and Real Estate: A Match Made in Heaven

Dan Suciu
7 min read · Apr 14, 2021

Like many other businesses that once ran on oodles of paperwork, real estate changed quickly with the rise of the internet. Technology now allows realty giants like CBRE to scrape and manage their listings worldwide.

Automation has seeped into fields like email marketing and customer support, so why not optimise property listings online to attract a larger pool of buyers?

Google Ads and other online platforms suggest that almost half of buyer inquiries now happen digitally. If you find yourself looking up a new home, you will most likely find the same property advertised on multiple websites, each one generating extra exposure.

Most realtors have not been trained to be especially tech-savvy, so manually researching the market now feels like looking for Waldo. Web scraping opens up boundless possibilities and cuts through the sweeping amounts of unorganised information.

Web scraping to the rescue

Web scraping ensures that you have a vast amount of precise and credible real estate info.

As a result, you can use this insight to boost your company and earn substantial rewards.

Scraping real estate listings off the Internet through a data extraction service provider places you in a position to offer high-performance real estate services and solutions to your customers.

Scraping the web generates metrics that realtors can explore further to assess sales prospects and future customers. The following parameters can be obtained through web scraping (a sample record sketch follows the list):

  • Property type
  • Sale price
  • Location
  • Size
  • Amenities
  • Monthly rental price
  • Parking spaces
  • Property agent
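To make this concrete, each scraped listing can be stored as a simple record. The sketch below is purely illustrative; the field names and values are hypothetical and not tied to any particular listing site:

// Illustrative shape of one scraped listing (hypothetical field names and values)
const listing = {
    propertyType: "Condo",
    salePrice: 325000,            // USD
    location: "Miami, FL",
    size: 1100,                   // square feet
    amenities: ["pool", "gym"],
    monthlyRent: 2400,            // USD, if also listed for rent
    parkingSpaces: 1,
    agent: "Example Realty"
}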

Having all of the above information at their fingertips helps realtors make smarter decisions, communicate more effectively, and sell more quickly and profitably. Web scraping’s involvement in real estate is still in its early stages, but its promise is considerable.

Use cases

Accurately deduce your property’s value

Selling your childhood home? Don’t price it by shooting in the dark. There is room for profit once you have researched similar properties and their value. Market research can now be quick and efficient, so you get the best deal there is.
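One rough way to do that research is to compare price per square foot across similar scraped listings. Here is a minimal sketch, assuming you have already scraped a few comparable properties (the numbers below are made up):

// Estimate a value from comparable listings (illustrative figures only)
const comps = [
    { price: 310000, sqft: 1050 },
    { price: 355000, sqft: 1200 },
    { price: 298000, sqft: 980 }
]
const avgPricePerSqft = comps.reduce((sum, c) => sum + c.price / c.sqft, 0) / comps.length
const myHomeSqft = 1100
console.log(`Estimated value: $${Math.round(avgPricePerSqft * myHomeSqft)}`)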

Maximize rental yield & long-term sustainability

Before investing in real estate, the most important thing to remember is the rental yield. You will find out which properties in any neighbourhood have the best rental yield by scraping data from real estate websites. Scraping also shows which property types are most common in a given region and have the highest return on investment (ROI).
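As a reference point, gross rental yield is typically the annual rent divided by the purchase price. A quick sketch with made-up numbers:

// Gross rental yield = (monthly rent * 12) / purchase price * 100 (illustrative figures)
const monthlyRent = 2400
const purchasePrice = 325000
const grossYield = (monthlyRent * 12) / purchasePrice * 100
console.log(`Gross rental yield: ${grossYield.toFixed(2)}%`) // ~8.86%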

Collecting and reviewing this data allows your company to make well-informed decisions backed by high-quality information. It’s also a perfect way to maintain a strategic lead over your competition.

Track vacancy rates

Investing in a property that sits empty can be risky. It’s worth looking at property details and at suburbs with an unusually high number of rental listings to work out what might be driving the increased vacancy rates for a particular property.

With enough scraped data, companies can tell the difference between diamonds in the rough and ticking time bombs.
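A crude proxy, assuming your scraped data flags listings that are currently vacant, is simply the share of vacant listings in a neighbourhood sample (the data below is invented for illustration):

// Vacancy proxy: share of scraped listings flagged as vacant (hypothetical data)
const neighbourhoodListings = [
    { id: 1, vacant: true },
    { id: 2, vacant: false },
    { id: 3, vacant: true },
    { id: 4, vacant: false }
]
const vacancyRate = neighbourhoodListings.filter(l => l.vacant).length / neighbourhoodListings.length
console.log(`Vacancy rate: ${(vacancyRate * 100).toFixed(1)}%`) // 50.0%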

Invest like a pro

If you want to consciously invest in 2021, maybe you don’t need to listen to your parents’ advice from the ’80s. There is no room left for vague information that has not been updated since Google was founded.

Web harvesting lets you pull fresh real estate data from listing websites, paving the way for solid investment analysis.

Here’s how you do it

Let’s assume that we want to scrape Realtor.com for properties in Miami with a minimum of two baths and two bedrooms and compare them to see which one is the better option. The following steps will guide you through this process.

We’ll start by using a data extraction tool. Of course, you can still do this by building your own web scraper, but it’ll be time-consuming and won’t have as many features as a prebuilt tool. You can read more about this topic in this article.

I will stick to WebScrapingAPI because it has some great features, like IP block avoidance, rotating proxies, geotargeting, and concurrent requests. In addition to that, we will use NodeJS and jsdom.

1. Create a WebScrapingAPI account

This step is an easy one, and there’s no need to be concerned about pricing because you have 1,000 free API calls so that you can try it out! After you register and validate your account via email, we’ll move on to the next step.

2. API key

You’ll need your API Key to authenticate with the API, which you can get from your dashboard.

You can reset your private key at any time by pressing the “Reset API Key” button if you think it has been stolen.

3. Integrate into your application

3.1 Required tools

We’ll need to install the got package to make HTTP requests and jsdom to parse the HTML returned by them.

npm install got jsdom

3.2 Set the API request parameters

const params = {
    api_key: "XXXXXXXXXXXX",
    url: "https://www.realtor.com/realestateandhomes-search/Miami_FL"
}

3.3 Make the request

const response = await got('https://api.webscrapingapi.com/v1', { searchParams: params })
console.log(response.body)

After making the request, we already see that the page is returned in HTML format.
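Keep in mind that got rejects the promise for non-2xx status codes by default, so in a real script you may want to wrap the call in a try/catch. A minimal sketch:

// got throws on non-2xx responses by default, so handle failures gracefully
let response
try {
    response = await got('https://api.webscrapingapi.com/v1', { searchParams: params })
} catch (error) {
    // Log the failure here and retry or bail out, depending on your use case
    console.error('Request failed:', error.message)
}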

3.4 Inspect the source code

We can explore the page and search for the elements we’re interested in using the browser’s Developer Tools. To inspect an element, right-click on it and select the Inspect option.

We found that the property element has a class of component_property-card. Some basic details about the listing are located inside this element in a wrapper called property-wrap.

3.5 Parsing the HTML

Since the result is returned as HTML, we need to parse it before we can manipulate it. JSDOM will do the job for us!

const {document} = new JSDOM(response.body).window

3.6 Processing the results

We search for all the elements with the class component_property-card and iterate over them. Inside each listing, we look for the elements corresponding to the property’s address, price, size, beds and baths; if these properties exist, we store them in an object. After we finish with the current listing, we push it into an array.

Iterating through all the listings in the array, we only show the ones corresponding to our conditions (at least two baths and two beds).

We wrap the code with an async function and it should look like this:

const { JSDOM } = require("jsdom");
const got = require("got");

(async () => {
    const params = {
        api_key: "XXXXXXXXXX",
        url: "https://www.realtor.com/realestateandhomes-search/Miami_FL"
    }

    // Send the target URL through the API and parse the returned HTML
    const response = await got('https://api.webscrapingapi.com/v1', { searchParams: params })
    const { document } = new JSDOM(response.body).window

    // Each property card holds the details of one listing
    const listings = document.querySelectorAll('.component_property-card')
    const properties = []

    listings.forEach(el => {
        if (el) {
            const price = el.querySelector('span[data-label="pc-price"]')
            const beds = el.querySelector('li[data-label="pc-meta-beds"]')
            const baths = el.querySelector('li[data-label="pc-meta-baths"]')
            const size = el.querySelector('li[data-label="pc-meta-sqft"]')
            const address = el.querySelector('div[data-label="pc-address"]')

            // Only keep the fields that are actually present on the card
            let listing = {}
            if (price) {
                listing.price = price.innerHTML
            }
            if (beds && beds.querySelector('.meta-value')) {
                listing.beds = beds.querySelector('.meta-value').innerHTML
            }
            if (baths && baths.querySelector('.meta-value')) {
                listing.baths = baths.querySelector('.meta-value').innerHTML
            }
            if (size && size.querySelector('.meta-value')) {
                listing.size = size.querySelector('.meta-value').innerHTML
            }
            if (address && address.textContent) {
                listing.address = address.textContent
            }
            properties.push(listing)
        }
    })

    // Show only the listings with at least two beds and two baths
    // (parseFloat handles values like "2" or "2.5" pulled from the page)
    properties.forEach((property) => {
        if (parseFloat(property.beds) >= 2 && parseFloat(property.baths) >= 2) {
            console.log("Address: " + property.address)
            console.log("Price: " + property.price)
            console.log("Beds: " + property.beds)
            console.log("Baths: " + property.baths)
            console.log("Size: " + property.size + " sqft \n")
        }
    })
})();
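To run it, save the script to a file (for example scrape.js, a name picked purely for illustration) and execute it with Node:

node scrape.js

Each listing that matches the two-bed, two-bath filter will be printed to the console.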

4. Voila!

In a matter of seconds, you have all this data at your fingertips; it’s up to you how you use it.

Congratulations! You have scraped the web page successfully.

Using a data extraction tool is easy

Hopefully, by now you’ve realised how web scraping is transforming the real estate market.

As a result, those who are able to adapt to ever-changing technology trends and exploit data will have the greatest chance of setting the industry’s pace.

Web scrapers come in a variety of shapes and sizes. The best thing you can do is give the API a try before buying. Most of the products offer free options, be it a trial or some free calls/credits to try it out. Some of them are intended for non-technical users, while others need programming skills.

I’ve prepared a small list of articles for you that will help you make an informed decision and use web scraping to its full potential:
