Opinions, thoughts, and processes of others, plus some that are my own.
Content Marketing & UX Peak Performance
Tangential Content: Finding keyword ideas no one else (in your niche) is thinking about
Adapted from Out Of The Box SEO webinar, 10/19/2023
Tangential content is content that isn’t directly related to your brand but sits adjacent to it.
Use it to increase the bottom line by:
a) building brand awareness and trust with a wider audience
b) creating more emotional content that better aligns with your customers’ lifestyle
c) creating a more diverse portfolio of content
d) improving your business’ knowledge graph
e) helping you better understand unmet customer needs, e.g. to guide new product ideas or post-purchase informational articles
Step 1: Establish buyer persona
“To think about these tangential ideas, you need to put your mind into the mind of your customer.”
Use Similarweb for demographics.
Step 2: Create a mind map
“Splurge ideas [organically] based on customer profile.”
Step 3: Find the data to support your ideas
a) Search your keywords
b) Export keywords data from search results
Although keyword research tools may show zero search volume for the keyword itself, the individual ranking URLs will rank for additional keywords. That additional-keyword data is what’s being exported.
Step 4: Seek additional sources of inspiration
“To give a monthly search volume, keyword tools rely on historical data. Consequently, traditional keyword research methods are unlikely to surface emerging trends that don’t have historical data yet.”
“Dynamic forums like Reddit and Quora have a huge user base; all of whom are asking questions that many of them cannot find the answers to elsewhere online.”
a) Look for opportunities where keyword research tools have zero volume, but the keyword has good engagement within forums.
b) When searching the keyword, look for examples that are primarily being addressed by forums within search results.
These are indicative of questions that are being frequently asked, but have not been picked up by keyword research tools.
- Scrape all related ideas within the forum.
- Use AI to make the questions less “chatty”, i.e. from colloquial into a more formal structure.
- Pull related questions to your keyword (AlsoAsked).
- Download search volume for all keywords in your dataset (Keywords Everywhere).
- Combine CSVs / files.
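The final combine step above can be scripted. A minimal sketch in Python, assuming each export shares a `keyword` column (the column name is an assumption; tool exports vary):

```python
import csv

def merge_keyword_csvs(paths, key="keyword"):
    """Combine several keyword CSV exports into one de-duplicated list.

    Assumes each file has a header row containing a `keyword` column;
    the first row seen for a given keyword wins.
    """
    seen = {}
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                kw = row[key].strip().lower()
                if kw and kw not in seen:
                    seen[kw] = row
    return list(seen.values())
```

De-duplicating case-insensitively here avoids "Kayak Anchor" and "kayak anchor" surviving as separate rows into the clustering step.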
Step 5: Cluster your keywords
“Clustering keywords helps us understand what pages to create and identify repeated questions in ‘zero volume’ keywords.”
Individual keywords may be zero volume, but when clustered they paint a clearer picture of the opportunity.
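A toy illustration of the clustering idea. Real tools (e.g. Keyword Insights) cluster by SERP overlap; this naive version groups by shared tokens, purely to show how scattered zero-volume keywords aggregate into one page-worthy topic:

```python
def cluster_keywords(keywords, min_shared=2):
    """Naive keyword clustering for illustration only: a keyword joins a
    cluster if it shares at least `min_shared` tokens with that cluster's
    seed keyword; otherwise it starts a new cluster."""
    clusters = []
    for kw in keywords:
        tokens = set(kw.lower().split())
        for cluster in clusters:
            seed_tokens = set(cluster[0].lower().split())
            if len(tokens & seed_tokens) >= min_shared:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters
```

Even if each input keyword showed zero volume individually, a cluster of eight variants signals a page worth creating.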
Understanding Google’s Helpful Content Update
If your traffic has been declining, it might be the result of the Helpful Content System assessing a large proportion of your content as generally not the most helpful result when it comes to meeting the searcher’s primary need.
Adapted from Google’s Helpful Content & Other AI Systems May Be Impacting Your Site’s Visibility
This classifier is essentially saying, “In general, content from this site is not the most helpful option for searchers when compared to others.”
Each time I compare content that Google’s algorithms started to prefer against content from a site that started to struggle, the preferred content is clearly more helpful at getting the searcher to their answer and meeting their need.
A common mistake I encounter in website audits of low-ranking pages is a tendency to overwrite.
There is nothing wrong with creating comprehensive content. But there’s a point at which the content veers off-topic.
That phrase, “How to Choose a Fishing Kayak,” encompasses multiple subtopics (as outlined above).
Some of the broader fishing kayak topics are:
– Kayak paddles.
– Trolling motors.
– Fish finders (sonar units).
– Short handle fishing nets.
– Gear crates.
All of those subtopics are relevant to fishing kayaks. But they are not relevant to the topic of “How to Choose a Fishing Kayak.”
They are relevant to a different topic, such as “What Gear Does a Fishing Kayak Need?”
Adapted from 6 SEO Concepts To Focus On Right Now
When writing content or auditing a client’s content, always keep an eye out for topic drift.
In a nutshell, the SEO Avalanche technique refers to the process of creating content by targeting keywords according to your site’s “traffic tier.”
This theory aligns with the modern keyword research approach of building topical authority and relevance.
Creating content optimized for keywords within your traffic tiers about the same topic allows you to scale your website performance reasonably.
By sticking to your traffic tiers when researching keywords, you avoid getting too ambitious with the keywords you want to optimize.
As a result, you can expect to move up the traffic tiers gradually.
Adapted from Modern Keyword Research, Charles Floate
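A rough sketch of the tier idea: cap target keyword volume at your current traffic tier. The tier boundaries below are illustrative placeholders, not the canonical Avalanche numbers:

```python
def max_target_volume(daily_organic_visits, tiers=(10, 20, 50, 100, 200, 500, 1000)):
    """Return the highest monthly search volume worth targeting, given
    current daily organic visits. Tier boundaries here are hypothetical
    examples, not the exact figures from the Avalanche technique."""
    ceiling = tiers[0]
    for t in tiers:
        if daily_organic_visits >= t:
            ceiling = t  # climb to the highest tier the site has reached
        else:
            break
    return ceiling
```

A site getting 60 visits a day would stick to keywords around 50 monthly searches or less, then re-check its tier as traffic grows.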
Understanding A Topic Cluster Model
When it comes to navigation, [company] has implemented a clever approach by placing the entry point to the guide’s pillar page in the footer.
This strategy serves as an additional element that helps discoverability and internal linking, and emphasizes the importance of the guide within the website.
Adapted from Great Topic Cluster Example to Structure your Website | thruuu
Reducing Content Cannibalization with Keyword Clustering
Problem: Duplicated intent has led to keyword cannibalization.
Solution: Using software (Keyword Insights) to cluster keywords in order to identify which ones have the same intent.
- Conduct keyword research to expand the initial data set (in case relevant keywords were missed the first time).
- Upload the new data set to the keyword clustering software.
- Redirect the URLs whose keywords are now grouped together into a single core URL (choosing the URL that ranks best or has the best metrics).
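The consolidation step might be scripted like this. The (url, metric) pair structure and the "best metric wins" tie-break are my assumptions, not from the original notes:

```python
def build_redirect_map(cluster_pages):
    """Given pages in one intent cluster as (url, metric) pairs, where
    metric might be organic traffic or ranking strength, pick the
    strongest URL as canonical and map the rest to it for 301 redirects."""
    canonical = max(cluster_pages, key=lambda p: p[1])[0]
    return {url: canonical for url, _ in cluster_pages if url != canonical}
```

The output is a ready-made old-URL → new-URL map you can paste into your redirect rules.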
Freshness Distance – Pt. 1
“For most, updating content once per year will do 99% of the work, both in preventing decay and acting on content that is starting to decay. It’s also easy to operationalize.”
If your freshness distance is a year or more, or close to it, just default to once a year. If it’s narrower, such as three months, you need to update more frequently. Set that timeline and stick to it operationally instead of trying to solve for content decay.
In SEO, we should be proactive not reactive to win.
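One way to operationalize the cadence rule above. The 300-day cutoff for "close to a year" is my assumption, not from the source:

```python
from datetime import date, timedelta

def next_update_due(last_updated, freshness_distance_days):
    """Schedule the next content refresh proactively: if the observed
    freshness distance is near or beyond a year (cutoff of 300 days is
    an assumption), default to an annual cycle; if it's narrower, update
    on that shorter cycle instead."""
    cadence = 365 if freshness_distance_days >= 300 else freshness_distance_days
    return last_updated + timedelta(days=cadence)
```

Running this over a content inventory turns "we should refresh things" into a concrete, sortable due-date column.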
Freshness Distance – Pt. 2
Reclaiming Historical 404s (and backlinks) with WebArchive.org CDX Server API
WebArchive.org’s CDX Server API can easily surface historical URLs that now resolve 404.
Crawl these URLs in Screaming Frog to view the redirect chains, then prioritize fixes by the number of referring domains pointing to each URL.
- Query the server and set your filters:
- Crawl these URLs in List Mode in Screaming Frog and export the redirect chains.
- Filter for Status Code: 404 and copy/paste this list into URL Profiler with ahrefs API enabled.
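The CDX query from the first step can be built like this. The field and filter choices are one reasonable configuration of the public CDX Server API, not necessarily the exact query from the original post:

```python
from urllib.parse import urlencode

def cdx_query_url(domain):
    """Build a Wayback Machine CDX Server query that returns all
    historical URLs for a domain, collapsed to unique URLs and limited
    to snapshots that returned 200 at capture time."""
    params = {
        "url": f"{domain}/*",       # everything under the domain
        "output": "json",
        "fl": "original",           # only return the original URL field
        "collapse": "urlkey",       # one row per unique URL
        "filter": "statuscode:200",
    }
    return "http://web.archive.org/cdx/search/cdx?" + urlencode(params)
```

Fetch the resulting URL in a browser or with any HTTP client, then feed the list into Screaming Frog’s List Mode.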
Adapted from Fixing historical redirects using Wayback Machine APIs
I’ve seen this one tactic — fixing historical redirects — make the difference between being just another competitor and being a real player at the top of the market.
Bonus: regex for removing image files and non-HTML files from your list:
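The regex itself didn’t survive in these notes; here is a hypothetical stand-in that drops URLs ending in common image and non-HTML asset extensions:

```python
import re

# Hypothetical pattern -- not the original regex from the source notes.
# Matches URLs ending in common image/asset extensions, optionally
# followed by a query string.
NON_HTML = re.compile(
    r"\.(?:jpe?g|png|gif|webp|svg|ico|css|js|pdf|zip|mp4)(?:\?.*)?$",
    re.IGNORECASE,
)

def keep_html_urls(urls):
    """Filter a URL list down to likely HTML pages before crawling."""
    return [u for u in urls if not NON_HTML.search(u)]
```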
A Take on Writing and Generative AI
End time included (37:12)
A Take on Writing and Generative AI, Pt. 2
…one thing I think is often true of AI-generated content is that, over time, it can become less helpful than other original content on the SERP, which can lead to bounces off your page, and that is something Google weighs heavily in ranking.
I think that having topic suggestions inside your editor is a really powerful feature that enables you to start writing straight away instead of having to go to the SERP for the keyword and reading tons of other articles just to get up to speed yourself.
The ability to generate content when your own brain is stumped is really good, and the output is decent, but you should probably always apply your own manual edits.
The key pitfall of using AI is when you get lazy with it.
If you use AI in an intelligent way, like as an actual assistant rather than a pure source, then you’re more likely to produce better content that can continue to rank.
Adapted from Niche Campus, July 20, 2023
A Take on Unique Content vs. “Content Grading Tools”
- Unique perspectives (What does our client think about the topic? What can we find on forums like Reddit, Quora, or Twitter?)
- SME Quotes (Who can we build relationships with who can provide recurring quotes? What do our clients have to say about a topic – e.g. How to x…? Who can we quote without needing to reach out to them because they’ve already discussed the topic publicly?)
- Explanation by analogy
- Storytelling (Customer stories or stories already shared online)
- Data (1st party or 3rd party)
- Charts and graphs (Datawrapper, etc)
- Custom illustrations, manipulated photos, or AI photos
- Custom development
- Tools (Calculator, survey where results are revealed, etc)
- “Common complaints” — touch on what the opposing view is saying about the topic.
- Key takeaways at the beginning of the article, for tl;dr visitors.
- Placing the “Data (1st party or 3rd party)” at the beginning of the article (to promote passive link building: readers will see the citable data above the fold, lending to “zero time to value”).
Qualifications for Choosing Content Grading Tools (and why)
- Allow you to pick which page 1 sites to use for comparison.
- Have the ability to compare content both BEFORE and AFTER publication.
“SEO is a game of match and exceed.”
Google has been serving results based on intent buckets for at least 3.5 years now.
This means that they’ll return a mix of intents for most queries to improve their chance of getting the click (hyper-specific long-tail queries where the searcher’s intent is obvious don’t fit this scenario).
Most pages can be slotted into one of three intents:
Whichever page classification you have, you should only compare yourself to the similar pages on page 1.
You might outrank an eCommerce result with your info/review page, but you won’t replace it.
To rank on page 1, we have to be top 3 for our bucket. But it also means that once we get into the top 30, there are only ten sites we’re competing with for page 1 real estate.
Adapted from https://grindstoneseo.com/blog/how-i-seo/
Quick Keyword Cleaning w/ Google Apps Script
Clean up geo modified keywords (or any rows with specific keywords) from keywords research data in Sheets before a clustering project:
This Google Apps Script is intended to delete rows from the currently active sheet in a Google Spreadsheet if any cell in the row contains one of the specified keywords. The keywords are contained in three different arrays (e.g. cities), which are then combined into one list. The function first retrieves all data from the active sheet, then iterates through each cell of each row to check whether it contains any of the keywords. If a keyword is found, the row number is added to a rowsToDelete array. Finally, the script iterates through the rowsToDelete array from bottom to top to delete each row by its number, so that deletions don’t shift the indices of rows still to be removed.
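Since the script itself isn’t reproduced in these notes, here is the same bottom-to-top deletion logic sketched in Python over an in-memory table (function and parameter names are mine):

```python
def rows_to_delete(rows, keywords):
    """Mirror of the described Apps Script logic: flag any row where any
    cell contains one of the keywords, and return the flagged indices in
    reverse order so deleting them one by one never shifts the indices
    of rows still waiting to be deleted."""
    kws = [k.lower() for k in keywords]
    flagged = [
        i for i, row in enumerate(rows)
        if any(k in str(cell).lower() for cell in row for k in kws)
    ]
    return list(reversed(flagged))
```

Deleting from the bottom up is the key trick; deleting top-down would invalidate every index collected after the first deletion.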
Alternatively, upload the CSV to ChatGPT’s Code Interpreter and do the cleanup there.
“How many links do I need??”
A simple approach to calculating monthly minimum link building needs for SEO, based on competitive landscape:
- Open top 5-10 competitors in ahrefs
- Calculate # of referring domains received within last 6-12 months for each
- Calculate average of total
- Divide by # of months
This is the average number of new referring domains your competitors gain each month, and the minimum you need to build monthly.
Adapted from A Business Intelligence Approach To Link Building
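The arithmetic in the steps above, as a small helper (function and parameter names are mine):

```python
def monthly_link_minimum(competitor_new_ref_domains, months):
    """Floor for monthly link building: average the referring domains
    each competitor gained over the window, then divide by the window
    length in months. Inputs: one count per competitor, window length."""
    avg_total = sum(competitor_new_ref_domains) / len(competitor_new_ref_domains)
    return round(avg_total / months)
```

For example, three competitors gaining 120, 60, and 90 referring domains over six months implies a minimum of about 15 new referring domains per month.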
System for using low-quality competitors against themselves in order to rank in search:
- Identify lowest quality competitors by analyzing Google results page 2 to 10, for your biggest keywords.
- Note “garbage” results, i.e. poor design.
- Repeat until you find ~8 examples, with at least 1,000 indexed articles.
- This suggests that they are publishing a lot of content but can only compete in the long tail.
- Run a keyword gap analysis: your domain vs. the example domains.
- Sort by volume and find keywords that make sense for your business.
- Create and publish content targeting the missing keywords.
Code Interpreter + Search Console = ⬆️ CTR
- Export GSC URL performance data (CTR, impressions, clicks).
- Map title tags to URLs.
- Upload to Code Interpreter.
- Prompt: Identify top performing URLs.
- Prompt: Identify the lowest performing URLs.
- Prompt: Suggest new title tag for low performing URLs based on top performing URLs.
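The same "find low performers" step can be done locally instead of via Code Interpreter. A sketch, assuming an export with URL, Clicks, and Impressions columns (header names vary between GSC export methods, so treat them as assumptions):

```python
import csv

def low_ctr_candidates(gsc_csv_path, min_impressions=1000, max_ctr=0.02):
    """From a GSC performance export, return high-impression, low-CTR
    pages: prime candidates for title-tag rewrites. CTR is recomputed
    from clicks/impressions rather than parsed from the export."""
    out = []
    with open(gsc_csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            impressions = int(row["Impressions"])
            ctr = int(row["Clicks"]) / impressions if impressions else 0.0
            if impressions >= min_impressions and ctr <= max_ctr:
                out.append((row["URL"], impressions, round(ctr, 4)))
    # Biggest impression counts first: the largest upside per rewrite.
    return sorted(out, key=lambda r: r[1], reverse=True)
```

The output pairs naturally with the title-tag mapping from the earlier steps: rewrite the flagged URLs’ titles in the style of the top performers.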