Top 10 AI automation articles from Zapier's blog in 2025 (part 2)
You're reading part 2 of the Top 10 AI Automation Articles from Zapier's Blog in 2025 series, where I let AI do the ranking for me:
- Check out Part 1 to see the ranking.
- Check out Part 3 to learn about the ranking process and how I used the LLMs.
Let's dive into the code!
Retrieving the list of articles with metadata
To download the articles, I first looked for the pages listing those articles. I inspected the page https://zapier.com/blog/all-articles/automation-inspiration/ using the DevTools. There, I found the information I wanted inside a JSON object with the ID __NEXT_DATA__, defined in a <script> tag.
In the console, I ran the following code to get the list of pagination URLs:
const nextDataElt = document.getElementById("__NEXT_DATA__");
const nextData = JSON.parse(nextDataElt.textContent);
nextData.props.pageProps.paginationUrls
// [
// "/blog/all-articles/automation-inspiration",
// "/blog/all-articles/automation-inspiration/2",
// "/blog/all-articles/automation-inspiration/3",
// "/blog/all-articles/automation-inspiration/4",
// ...
// "/blog/all-articles/automation-inspiration/35"
// ]
Looking closer at the JSON object, I saw that the articles on each page are listed under props.pageProps.articlesMostRecent. Here's an example of that section in JSON format:
[
  {
    "topicId": "1H263FsCuJ3XaJKmEzQd8v",
    "title": "4 ways to automate your bookmark manager",
    "slug": "automate-bookmark-manager",
    "categoryAssemblyTitle": "Automation inspiration",
    "categoryAssemblyHref": "/blog/all-articles/automation-inspiration",
    "image": {
      "alt": "",
      "maxWidth": 0,
      "title": "automate-bookmark-manager-00-hero",
      "url": "https://images.ctfassets.net/lzny33ho1g45/2m3MT9v0XfDAzFKdh0Dxrq/d63dd16b01bb673f15e76873a100fbe3/automate-bookmark-manager-00-hero.jpg"
    },
    "readTimeHandcoded": 0,
    "readTimeComputed": 197,
    "editorialLastPublishedDateComputed": "2025-07-22T05:00:00.000Z",
    "editorialOriginalPublishDate": "2025-01-17T00:00:00.000-08:00",
    "shortDescription": "Don't manually sort through saved items. Instead, automatically sync bookmarks between apps, convert them into tasks, and even generate curated RSS feeds for your audience. Here's how.",
    "author": {
      "name": "Michael Toth",
      "email": "..."
    },
    "timeToRead": 197,
    "description": "Don't manually sort through saved items. Instead, automatically sync bookmarks between apps, convert them into tasks, and even generate curated RSS feeds for your audience. Here's how."
  },
  {
    "topicId": "1DyKa0v0vW8eebqJd6yDgx",
    "title": "5 ways to automate Chatbase with Zapier",
    "slug": "automate-chatbase",
    "categoryAssemblyTitle": "Automation inspiration",
    "categoryAssemblyHref": "/blog/all-articles/automation-inspiration",
    "image": {
      "alt": "",
      "maxWidth": 0,
      "title": "automate-chatbase-hero-00",
      "url": "https://images.ctfassets.net/lzny33ho1g45/2u09rJVQpb9VuHv0Tpuu87/df8d4e01cc06473292389d170fad826b/Group_14838.jpg"
    },
    "readTimeHandcoded": 0,
    "readTimeComputed": 249,
    "editorialLastPublishedDateComputed": "2025-07-21T07:00:00.000Z",
    "editorialOriginalPublishDate": "2023-10-10T00:00:00.000-07:00",
    "shortDescription": "With Chatbase and Zapier, you can use automated workflows—called Zaps—to streamline everything from support tickets to onboarding. ",
    "author": {
      "name": "Elena Alston",
      "email": "..."
    },
    "timeToRead": 249,
    "description": "With Chatbase and Zapier, you can use automated workflows—called Zaps—to streamline everything from support tickets to onboarding. "
  },
  ...
]
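Only a few of these fields matter for the rest of this post: slug (to build the article URL) plus title, author, and the publish dates (for the Markdown header later). As an illustration only (this helper and its field selection are not part of the actual script), projecting them out could look like:

```python
def compact(article: dict) -> dict:
    """Keep only the metadata fields used later in this post.

    Field names come from the JSON excerpt above; everything else
    (image, descriptions, ...) is dropped.
    """
    return {
        "title": article.get("title"),
        "slug": article.get("slug"),
        "author": article.get("author", {}).get("name"),
        "published": article.get("editorialOriginalPublishDate"),
        "updated": article.get("editorialLastPublishedDateComputed"),
        "read_seconds": article.get("readTimeComputed"),
    }
```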
So, I decided to retrieve that article metadata for each of the 35 pages listed in the paginationUrls key. I stored the combined data in a variable called articles and saved it to the file data/articles.json.
To do all this, I used the requests package for the HTTP requests, beautifulsoup4 to grab the __NEXT_DATA__ string, and json to parse it into Python:
# zapier_automation_inspiration.py
import requests
from bs4 import BeautifulSoup
import os
import logging
import json

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("zapier-inspiration")

# 1. List of Articles
base_url = "https://zapier.com"
list_url = base_url + "/blog/all-articles/automation-inspiration/"

resp = requests.get(list_url)
soup = BeautifulSoup(resp.text, "html.parser")
next_data_tag = soup.find("script", id="__NEXT_DATA__")
pagination_urls = []
if next_data_tag:
    next_data = json.loads(next_data_tag.string)
    pagination_urls = next_data.get("props", {}).get("pageProps", {}).get("paginationUrls", [])
else:
    logger.error(f'No tag with id "__NEXT_DATA__" found in {list_url}.')

articles = []
for p in pagination_urls:
    articles_url = base_url + p
    resp = requests.get(articles_url)
    soup = BeautifulSoup(resp.text, "html.parser")
    next_data_tag = soup.find("script", id="__NEXT_DATA__")
    if next_data_tag:
        next_data = json.loads(next_data_tag.string)
        if (articles_most_recent := next_data.get("props", {}).get("pageProps", {}).get("articlesMostRecent")):
            articles.extend(articles_most_recent)
            logger.info(f"{len(articles_most_recent)} articles found in {articles_url}.")
        else:
            logger.info(f"No articles found in {articles_url}.")

os.makedirs("data", exist_ok=True)
with open("data/articles.json", "w", encoding="utf-8") as f:
    json.dump(articles, f, ensure_ascii=False, indent=2)

logger.info(f"{len(articles)} articles found.")
logger.info("The list of articles has been saved in the data/articles.json file.")
To run this, use the following commands:
$ uv init
$ uv add requests beautifulsoup4
$ uv run zapier_automation_inspiration.py
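Once the script finishes, a quick sanity check (my own addition, not part of the script above) can confirm what landed in data/articles.json. Duplicates are possible if the same article shows up on more than one pagination page, so counting unique slugs is a useful cross-check:

```python
import json
from pathlib import Path

def summarize(articles: list) -> tuple[int, int]:
    """Return (total articles, unique slugs).

    The two numbers differ if an article appears on more than one
    pagination page.
    """
    slugs = {a.get("slug") for a in articles if isinstance(a, dict)}
    return len(articles), len(slugs)

path = Path("data/articles.json")
if path.exists():
    articles = json.loads(path.read_text(encoding="utf-8"))
    total, unique = summarize(articles)
    print(f"{total} articles, {unique} unique slugs")
```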
Downloading articles and converting them to Markdown
Next, for each article:
- I retrieved the article page using the slug key from the article dictionary.
- I extracted the HTML string inside the unique <article> tag with beautifulsoup4. This gives the main content.
- I converted the article content into Markdown using markdownify and removed some extra, noisy lines.
- I took some article metadata (including the title) from the article dictionary and added it as a header before the content.
- Finally, I saved each article in the file data/articles/<slug>.md.
I also added time.sleep(2) between requests to the Zapier server. This prevented SSL errors like the one below, which I kept getting if I didn't add a pause:
requests.exceptions.SSLError: HTTPSConnectionPool(host='zapier.com', port=443): Max retries exceeded with url: /blog/automate-activecampaign-with-zapier/ (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))
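Instead of (or on top of) a fixed pause, retries with exponential backoff can be configured on a requests.Session. This is a sketch I didn't use in the final script, and it may not catch this particular SSL EOF error, which surfaces as a connection-level failure rather than an HTTP status:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry each request up to 3 times with exponential backoff,
# also retrying on common transient HTTP status codes.
retry = Retry(
    total=3,
    backoff_factor=2,
    status_forcelist=(429, 500, 502, 503, 504),
)
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retry))

# session.get(url) can then replace requests.get(url) in the script.
```

Even with retries, throttling the loop with time.sleep remains the simplest way to avoid overwhelming the server in the first place.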
I encountered only two errors, with the articles "5 ways to automate X (formerly known as Twitter)" and "How to connect Facebook and Twitter for seamless social posting":
ERROR:zapier-inspiration:Failed to fetch 'https://zapier.com/blog/automate-twitter/': no <article> tag found.
ERROR:zapier-inspiration:Failed to fetch 'https://zapier.com/blog/connect-facebook-and-twitter-for-seamless-social-posting/': no <article> tag found.
These articles no longer exist due to changes in the X API and are now redirected to https://zapier.com/l/twitter-integration-faq.
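Such dead links can also be detected programmatically: requests records the chain of intermediate 3xx responses in resp.history and the final URL in resp.url. A small helper along these lines (an illustration, not part of my script) could flag and skip redirected articles:

```python
def was_redirected(resp) -> bool:
    """True if the response was reached via at least one redirect.

    requests stores the intermediate 3xx responses in resp.history
    (oldest first) and the final URL in resp.url.
    """
    return len(resp.history) > 0

# Possible use inside the download loop (sketch):
# if was_redirected(resp):
#     logger.warning(f"'{article_url}' redirected to '{resp.url}'; skipping.")
#     continue
```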
# zapier_automation_inspiration.py
# ...
from markdownify import markdownify as md
import re
from datetime import datetime
import time

# ...
# 1. List of Articles
# ...

# 2. Download articles and convert to Markdown format
os.makedirs("data/articles/", exist_ok=True)
articles = [a for a in articles if isinstance(a, dict) and "slug" in a]
for article in articles:
    slug = article.get("slug")
    article_url = f"{base_url}/blog/{slug}/"
    logger.info(f"Processing '{article_url}'.")
    try:
        resp = requests.get(article_url)
    except requests.exceptions.RequestException as e:
        logger.error(f"Failed to fetch {article_url}: {e}")
        continue
    time.sleep(2)
    soup = BeautifulSoup(resp.text, "html.parser")
    article_tag = soup.find("article")
    if article_tag:
        # Always use #-style Markdown headings
        article_md = md(str(article_tag), heading_style="atx")
        # Remove noisy lines like these:
        #   [Try it](/webintent/create-zap?template=1379702)
        #   * 
        article_md = "\n".join(
            line
            for line in article_md.splitlines()
            if not line.startswith("[Try it](") and line.strip() != "*"
        )
        # Add some metadata from the article dictionary as a header
        # before the content (one possible format)
        header = (
            f"# {article.get('title', '')}\n\n"
            f"Author: {article.get('author', {}).get('name', '')}\n"
            f"Published: {article.get('editorialOriginalPublishDate', '')}\n"
            f"Updated: {article.get('editorialLastPublishedDateComputed', '')}\n\n"
        )
        with open(f"data/articles/{slug}.md", "w", encoding="utf-8") as f:
            f.write(header + article_md)
    else:
        logger.error(f"Failed to fetch '{article_url}': no <article> tag found.")
Here's an excerpt of the Markdown output for one article:
..., [AI-powered agents](https://zapier.com/agents), or [workflows](https://zapier.com/apps/categories/artificial-intelligence)) is a game-changer. It doesn't just knock out the tasks that eat into your day (and soul). It's the smartest way to scale your impact and bring your business true value. Here's how.
AI gives you automation superpowers. And Zapier helps you put those powers to use. We make it easy to turn your ideas into workflows so computers can do more work for you. [Sign up for AI beta features now.](https://zapier.com/ai)
### Table of contents
* [It keeps you (and your apps) from working in a silo](#silo)
* [It helps bring your automation ideas to life](#life)
* [It helps you scale your business and impact—faster](#impact)
* [It helps you focus on the work that matters](#focus)
## It keeps you (and your apps) from working in a silo
...
## It helps bring your automation ideas to life
...
## It helps you scale your business and impact—faster
...
## **It helps you focus on the work that matters**
...
## Power your business with AI and automation
...
*This article was originally published in September 2023. It was most recently updated in January 2025.*
That's all I have for today! Talk to you soon ;)
Built with one.el.