With Python, an IDE, and Selenium already installed, let's execute some code: we'll set up a web crawler and move ahead.
#IMPORT THESE PACKAGES
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.service import Service
#OPTIONAL PACKAGE, BUT MAYBE NEEDED
from webdriver_manager.chrome import ChromeDriverManager
Let's use the following line of code to define our web browser in Selenium:
#THIS INITIALIZES THE DRIVER (AKA THE WEB BROWSER)
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
We'll use the search phrase "cars" to acquire the URL of a specific Google Trends search page. Let's head over to Google Trends and search for cars:
As you'll see, there is a lot of information on this page. Let's imagine we want to scrape the most common related search term.
To accomplish this, return to your Python script and paste the following code:
#THIS PRETTY MUCH TELLS THE WEB BROWSER WHICH WEBSITE TO GO TO
driver.get('COPY AND PASTE YOUR URL HERE')
The placeholder should be replaced with the URL of our Google Trends search page:
#THIS PRETTY MUCH TELLS THE WEB BROWSER WHICH WEBSITE TO GO TO
driver.get('https://trends.google.com/trends/explore?q=cars&geo=US')
Next, we'll use the web browser to get the HTML of the topmost related query. First, open the developer tools in your browser. Then right-click on the top related search term (as shown below) and select Inspect; you should see something similar to this:
Now, in your Python script, paste the following code:
POSTS = driver.find_element(By.XPATH, 'DELETE THESE WORDS AND REPLACE WITH XPATH').text
Return to the Google Chrome browser and copy the XPath: right-click the highlighted line in the inspector tab, then choose Copy > Copy full XPath.
Return to your Python script and insert it between the two quotes:
POSTS = driver.find_element(By.XPATH, '/html/body/div[2]/div[2]/div/md-content/div/div/div[4]/trends-widget/ng-include/widget/div/div/ng-include/div/div[1]/div/ng-include/a/div/div[2]/span').text
print(POSTS)
That concludes the walkthrough. We'll now be able to see the top related query for that search term in our console:
Here is the final script for the project:
#IMPORT THESE PACKAGES
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.service import Service
#OPTIONAL PACKAGE, BUT MAYBE NEEDED
from webdriver_manager.chrome import ChromeDriverManager
#THIS INITIALIZES THE DRIVER (AKA THE WEB BROWSER)
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
#THIS PRETTY MUCH TELLS THE WEB BROWSER WHICH WEBSITE TO GO TO
driver.get('https://trends.google.com/trends/explore?q=cars&geo=US')
POSTS = driver.find_element(By.XPATH, '/html/body/div[2]/div[2]/div/md-content/div/div/div[4]/trends-widget/ng-include/widget/div/div/ng-include/div/div[1]/div/ng-include/a/div/div[2]/span').text
print(POSTS)
Once the script finishes, you have scraped information from Google Trends without using an API. That is the complete script for scraping Google Trends.
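If you want to reuse the script for other search terms, note that the explore URL is just the endpoint plus two query parameters, q (the search term) and geo (the region), both visible in the URL used above. A small helper to compose it (the parameter names come from that URL; the helper itself is our own sketch):

```python
from urllib.parse import urlencode

def trends_url(term, geo='US'):
    # compose the Google Trends explore URL for a given search term and region
    return 'https://trends.google.com/trends/explore?' + urlencode({'q': term, 'geo': geo})

print(trends_url('cars'))  # → https://trends.google.com/trends/explore?q=cars&geo=US
```

Passing the result to driver.get() lets you swap "cars" for any other term without editing the URL by hand; urlencode also takes care of escaping spaces and special characters.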
Scraping Intelligence Editorial Team is a collective of data specialists, analysts, and researchers with expertise in web scraping, data extraction, and market intelligence. The team produces well-researched guides, actionable insights, and industry-focused resources that help businesses unlock the value of data and make informed, strategic decisions.