Scraping and Price Intelligence: 5 Steps to Find the Best Price for Your Product

I set some resolutions at the beginning of the year: get vaccinated, resume travelling, and start a passive revenue stream. The first two are works in progress. The last one I had ignored until now.

When you want to start a passive revenue stream, the consensus is to focus on something that you enjoy: I enjoy video games.

So, I want to start an Etsy print-on-demand business with gaming t-shirts.

The benefits of web scraping on Etsy

For those who have been living under a rock, Etsy is a community-driven global online marketplace where people make, sell, and buy unique items.

The Etsy website gives three valuable pieces of information:

  • product prices
  • product reviews
  • related searches

We can use this information to run the ultimate market research for your niche, or to create the ultimate product: price it a little under the average, address every product-quality complaint found in competitors' negative reviews, and sell bundles based on your competitors' related searches.

Price-matching competing products

You can set up a cron job that runs every morning at 7 AM your time and generates a report with that day's average prices for the products you're interested in.

You can adjust the prices in your shop before your customers wake up and make your products more competitive.
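As a sketch, a crontab entry for such a daily 7 AM job might look like this (the paths and script name below are placeholders for your own setup):

```shell
# Run the price report script every day at 7:00 AM (server local time),
# appending both output and errors to a log file.
0 7 * * * /usr/bin/node /home/user/etsy-price-report/index.js >> /home/user/etsy-price-report/report.log 2>&1
```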

Monitoring competitors

Let’s say your shop receives 10 orders a day. At one point, the number of orders starts to decrease. The reviews are positive and customers are happy with the product. You’re scratching your head and don’t know what to do.

If you use a scraping tool to monitor your competitors, you would find out that they reduced the price for their products by $5.

You're currently selling at a noticeably higher price than your competitors. Knowing that is a huge leap in the right direction: you've identified the reason behind the drop in sales, and you're ready to respond with a countermeasure.
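A minimal sketch of such a monitoring check, assuming you already have your own price and a freshly scraped competitor average (the function name and the $5 margin are illustrative):

```javascript
// Hypothetical helper: flag when your price exceeds the competitor average
// by more than a given margin (default $5, matching the scenario above).
function checkCompetitiveness(myPrice, competitorAverage, margin = 5) {
  const difference = myPrice - competitorAverage;
  if (difference > margin) {
    return `Overpriced by $${difference.toFixed(2)}, consider adjusting`;
  }
  return 'Price is competitive';
}

console.log(checkCompetitiveness(29.99, 19.99)); // flags the $10 gap
console.log(checkCompetitiveness(19.99, 19.99)); // price is fine
```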

Understanding the needs of the customers from reviews

Product reviews offer valuable information about the customers' perception of a product. If one of your competitors sells a nice-looking product made from low-quality materials, you could easily create a similar product from quality materials.

Customers are then likely to pick your product over the lower-quality alternative.

Web scraping in action

Because I don't want to spend an entire week developing my own scraping solution, I chose the Free package from the WebScrapingAPI service: it comes with a pool of 100M+ rotating proxies and 1,000 requests/month.

My biggest concern is that Etsy will block my IP address for scraping the website. WebScrapingAPI's rotating proxies have us covered there.

Step 1 — Find the information you need

We need the average price of Fallout t-shirts. To get it, we scrape the first 3 search result pages on Etsy, extract each product's price, and calculate the average.

We scrape the first three Etsy search result pages for "fallout t-shirt": https://www.etsy.com/search?q=fallout%20t-shirt, then the same URL with &page=2 and &page=3 appended.

From these pages, we get the products list (the list items of the search results grid) and, for each product, we extract the current price (.v2-listing-card__info .n-listing-card__price .currency-value). Note that Etsy's class names change over time, so double-check the selectors in your browser's developer tools before running the script.

Step 2 — Install the dependencies

Before we write any code, we need to install the following dependencies:

  • axios: a Promise-based HTTP client for Node.js
  • cheerio: an HTML parser for Node.js with a jQuery-like syntax

Both can be installed from npm:

npm install axios cheerio

Step 3 — Scraping the site

Create an index.js file and paste the following code:

const cheerio = require('cheerio');
const axios = require('axios');

const api_key = '********************************';

// The first 3 search result pages for "fallout t-shirt"
const urls = [
  'https://www.etsy.com/search?q=fallout%20t-shirt',
  'https://www.etsy.com/search?q=fallout%20t-shirt&page=2',
  'https://www.etsy.com/search?q=fallout%20t-shirt&page=3',
];

// WebScrapingAPI endpoint; the session parameter keeps the same proxy between requests
const api_url = `https://api.webscrapingapi.com/v1?api_key=${api_key}&session=20210505&url=`;

(async () => {
  let total_price = 0;
  let products_count = 0;

  for (let i = 0; i < urls.length; i++) {
    let response;
    try {
      response = await axios.get(api_url + encodeURIComponent(urls[i]));
    } catch (error) {
      console.error(`Request failed for ${urls[i]}: ${error.message}`);
      continue;
    }

    const $ = cheerio.load(response.data);
    // Selector for the search results grid; verify it against Etsy's current markup
    const $products = $('ul.responsive-listing-grid > li');

    // Parse the products list
  }

  console.log(`Average price: ${(total_price / products_count).toFixed(2)}`);
})();

This code uses WebScrapingAPI to fetch each Etsy page and build a $products list from it.

Step 4 — Extract and calculate the average price

Replace the // Parse the products list comment with the following code:

$products.each((index, product) => {
  const $product = $(product);
  const $currencyValue = $product.find('.v2-listing-card__info .n-listing-card__price .currency-value');
  const price = parseFloat($currencyValue.eq(0).text());
  if (price) {
    total_price += price;
    products_count++;
  }
});
This code iterates over the products list, extracts each product's price, converts it to a float, and accumulates the total price and the product count for the final average.
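The accumulation logic can be isolated into a small helper and sanity-checked without hitting Etsy at all (the function name and the sample price strings are made up for illustration):

```javascript
// Standalone sketch of the averaging logic from Steps 3 and 4:
// parse each price text, skip anything that is not a number, and average the rest.
function averagePrice(priceTexts) {
  let total = 0;
  let count = 0;
  for (const text of priceTexts) {
    const price = parseFloat(text);
    if (price) {
      total += price;
      count++;
    }
  }
  return count > 0 ? (total / count).toFixed(2) : null;
}

console.log(averagePrice(['19.99', '24.50', '15.00', 'Sale'])); // "19.83"
```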

Step 5 — Test it

Run the script with the following command:

node index.js

If the planets are aligned and there's no bug in the code, you should receive the average price for Fallout t-shirts. At the time of writing, the average price is $20.21. Not great, not terrible.

I will set the price for the products in my shop to $19.99.
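That undercut from $20.21 to $19.99 can be expressed as a tiny helper, assuming ".99" psychological pricing (the function name is hypothetical):

```javascript
// Hypothetical helper: pick the nearest ".99" price point below the competitor average.
function priceJustUnder(average) {
  return (Math.floor(average) - 0.01).toFixed(2);
}

console.log(priceJustUnder(20.21)); // "19.99"
```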

Selling on Etsy is easy with a web scraper

This was a quick example of how powerful a scraping API can be when combined with a global eCommerce platform. With the provided code, you can find the average price for any product in a matter of seconds.

If you have any spare time on your hands, you can use the same code to scrape the product page for reviews and run a sentiment analysis on the reviews. This will give you important information about how the products from the competition are received by the customers.

A scraper can help you stay one step ahead of your competitors, set the right price for your product, and create new products by learning from your competitors’ bad reviews.

If you want to learn more about how a scraper can help a seller with price intelligence, or help a buyer find the best price for a product, I've prepared an article on how data extraction can help you get the best auction deals.

If you liked the article, buy one of my shirts (once I actually launch the shop).

CEO & Co-Founder @Knoxon, Full Stack Developer