Tangential content is content that isn’t directly related to your products or services, but sits adjacent to your brand.
Use it to increase your bottom line by: a) building brand awareness and trust with a wider audience b) creating more emotional content that better aligns with customer lifestyles c) creating a more diverse portfolio of content d) improving your business’s knowledge graph e) surfacing unmet customer needs, i.e. guiding new product ideas or post-purchase informational articles
Step 1: Establish buyer persona
“To think about these tangential ideas, you need to put your mind into the mind of your customer.”
Use Similarweb for demographics.
Step 2: Create a mind map
“Splurge ideas [organically] based on customer profile.”
Step 3: Find the data to support your ideas
a) Search your keywords b) Export keywords data from search results
Although keyword research tools may show zero search volume for the keyword itself, the individual ranking URLs will rank for additional keywords. This is the data you’re exporting.
Step 4: Seek additional sources of inspiration
“To give a monthly search volume, keyword tools rely on historical data. Consequently, traditional keyword research methods are unlikely to surface emerging trends that don’t have historical data yet.”
“Dynamic forums like Reddit and Quora have a huge user base; all of whom are asking questions that many of them cannot find the answers to elsewhere online.”
a) Look for opportunities where keyword research tools have zero volume, but the keyword has good engagement within forums. b) When searching the keyword, look for examples that are primarily being addressed by forums within search results.
These are indicative of questions that are being frequently asked, but have not been picked up by keyword research tools.
Scrape all related ideas within the forum.
Use AI to make the questions less “chatty”, i.e. rewrite them from colloquial phrasing into a more formal structure.
Pull related questions to your keyword (AlsoAsked).
Download search volume for all keywords in your dataset (Keywords Everywhere).
Combine CSVs / files.
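The combine step above can be done with a short script. A minimal sketch using only the standard library, assuming each export has a “Keyword” column (adjust `keyword_col` to match your tools’ actual headers):

```python
import csv
import glob

def combine_keyword_csvs(pattern, keyword_col="Keyword"):
    """Merge every CSV matching `pattern` into one de-duplicated keyword list."""
    seen = set()
    rows = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                kw = row.get(keyword_col, "").strip().lower()
                if kw and kw not in seen:  # de-dupe case-insensitively
                    seen.add(kw)
                    rows.append(row)
    return rows
```

Exports from different tools rarely share the same columns, so in practice you may need to normalize headers first.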
Step 5: Cluster your keywords
“Clustering keywords helps us understand what pages to create and identify repeated questions in ‘zero volume’ keywords.”
Individual keywords may be zero volume, but when clustered they paint a clearer picture of the opportunity.
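One common clustering approach (used as a rough stand-in here for what tools like Keyword Insights do; the threshold and greedy grouping are my assumptions, not the tool’s actual algorithm) is to group keywords whose top-10 SERPs share several ranking URLs:

```python
def cluster_by_serp_overlap(serps, min_shared=4):
    """Greedily group keywords whose SERP URLs overlap.
    `serps` maps keyword -> set of top-10 ranking URLs."""
    clusters = []
    for kw, urls in serps.items():
        for cluster in clusters:
            # Compare against the cluster's seed keyword only (greedy).
            if len(urls & serps[cluster[0]]) >= min_shared:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters
```

Keywords that land in the same cluster are candidates for a single page; zero-volume questions that cluster together are exactly the “repeated questions” the quote above refers to.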
Understanding Google’s Helpful Content Update
If your traffic has been declining, it might be the result of the Helpful Content System assessing a large proportion of your content as generally not the most helpful result when it comes to meeting the searcher’s primary need.
This classifier is essentially saying, “In general, content from this site is not the most helpful option for searchers when compared to others.”
Each time I compare the content Google’s algorithms started to prefer against the site that started to struggle, the preferred content is clearly more helpful at getting the searcher to their answer and meeting their need.
When it comes to navigation, [company] has implemented a clever approach by placing the entry point to the guide’s pillar page in the footer.
This strategy serves as an additional element that helps in discoverability, internal linking, and emphasizes the importance of the guide within the website.
Problem: Duplicated intent has led to keyword cannibalization.
Solution: use software (Keyword Insights) to cluster keywords and identify which ones share the same intent.
Conduct keyword research to add to the initial data set (in case relevant keywords have been initially missed).
Upload the new data set to the keyword clustering software.
Redirect keywords (and matching URL) that are now grouped together into a single, core URL (choosing the URL that ranks best or has the best metrics).
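The redirect step can be sketched as picking the strongest URL per cluster and 301ing the rest to it. A minimal illustration, assuming `metrics` maps each URL to whatever score you’re prioritizing by (referring domains, traffic, etc. — your choice of metric, not prescribed by the source):

```python
def redirect_map(cluster_urls, metrics):
    """Pick the best-scoring URL in a cluster as the canonical target
    and map every other URL in the cluster to it (for 301 redirects)."""
    target = max(cluster_urls, key=lambda u: metrics.get(u, 0))
    return [(u, target) for u in cluster_urls if u != target]
```

The output pairs translate directly into redirect rules (.htaccess, nginx, or your CMS’s redirect manager).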
Freshness Distance – Pt. 1
“For most, updating content once per year will do 99% of the work, both in preventing decay and acting on content that is starting to decay. It’s also easy to operationalize.”
If this number is larger than a year or close to it, just default to once a year. If it’s narrower – such as three months – you need to update more frequently. Set that timeline and operationally stick to it instead of trying to solve for content decay.
In SEO, we should be proactive, not reactive, to win.
Freshness Distance – Pt. 2
60-Second SEO: You can use Text-Diff Compare Checkers to analyze how competitors are changing their on-page content over time.
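If you’d rather script this than use an online checker, Python’s built-in `difflib` produces the same kind of before/after comparison from two saved snapshots of a page’s copy:

```python
import difflib

def content_diff(old, new):
    """Line-level unified diff between two snapshots of a page's copy."""
    return list(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile="before", tofile="after", lineterm=""))
```

Run it against snapshots captured on a schedule (or pulled from the Wayback Machine) to see exactly which paragraphs competitors refresh.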
Reclaiming Historical 404s (and backlinks) with WebArchive.org CDX Server API
WebArchive.org’s CDX Server API can easily surface historical URLs that now return a 404.
Crawling these URLs in Screaming Frog and viewing each redirect chain lets you prioritize fixes by the number of referring domains pointing to them.
Crawl these URIs in List Mode in Screaming Frog and export the redirect chains.
Filter for Status Code: 404 and copy/paste this list into URL Profiler with the Ahrefs API enabled.
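To get the historical URL list in the first place, you can query the CDX Server API directly. A minimal sketch: build the query URL (these are documented CDX parameters), fetch it with any HTTP client, then parse the JSON output, whose first row is the header:

```python
from urllib.parse import urlencode

CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"

def cdx_query_url(domain):
    """Build a CDX query for every captured URL on `domain`, as JSON."""
    params = {
        "url": f"{domain}/*",
        "output": "json",
        "fl": "original,statuscode",  # just the URL and its archived status
        "collapse": "urlkey",         # one row per unique URL
    }
    return f"{CDX_ENDPOINT}?{urlencode(params)}"

def parse_cdx_json(rows):
    """CDX JSON output is a list of rows; the first row is the header."""
    header, *data = rows
    return [dict(zip(header, row)) for row in data]
```

The parsed `original` URLs are what you paste into Screaming Frog’s List Mode for the crawl described above.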
I’ve seen this one tactic — fixing historical redirects — make the difference between being just another competitor out there to being a real player at the top of the market.
Bonus: regex for removing image files and non-HTML files from your list:
/.*\.(png|jpe?g|gif|bmp|css|js)$/i
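Applied in Python, the same pattern filters the URL list before it goes into URL Profiler:

```python
import re

# The regex from above, as a case-insensitive Python pattern.
NON_HTML = re.compile(r".*\.(png|jpe?g|gif|bmp|css|js)$", re.IGNORECASE)

def keep_html_urls(urls):
    """Drop image/CSS/JS URLs, keeping only page URLs worth redirecting."""
    return [u for u in urls if not NON_HTML.match(u)]
```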
A Take on Writing and Generative AI
End time included (37:12)
A Take on Writing and Generative AI, Pt. 2
…one thing I think is often true of AI-generated content is that, in time, it can become less helpful than other original content on the SERP, which can lead to bounces off your page, and that is something Google weighs heavily in ranking.
I think that having topic suggestions inside your editor is a really powerful feature that enables you to start writing straight away instead of having to go to the SERP for the keyword and reading tons of other articles just to get up to speed yourself.
The ability to generate content when your own brain is stumped is really good, and the output is decent, but you should probably always apply your own manual edits.
The key pitfall of using AI is when you get lazy with it.
If you use AI in an intelligent way, like as an actual assistant rather than a pure source, then you’re more likely to produce better content that can continue to rank.
A Take on Unique Content vs. “Content Grading Tools”
Strategies:
Unique perspectives (What does our client think about the topic? What can we find on forums like Reddit, Quora, or Twitter?)
SME Quotes (Who can we build relationships with who can provide recurring quotes? What do our clients have to say about a topic – e.g. How to x…? Who can we quote without needing to reach out to them because they’ve already discussed the topic publicly?)
Personality
Explanation by analogy
Storytelling (Customer stories or stories already shared online)
Data (1st party or 3rd party)
Charts and graphs (Datawrapper, etc)
Custom illustrations, manipulated photos, or AI photos
Custom development
Tools (Calculator, survey where results are revealed, etc)
“Common complaints” — touch on what the opposing view is saying about the topic.
Bonus:
Key takeaways at the beginning of the article, for tl;dr visitors.
Placing the “Data (1st party or 3rd party)” at the beginning of the article.
(in order to promote passive link building: readers will see the citable data above-the-fold, lending to “zero-time-to-value”.)
Qualifications for Choosing Content Grading Tools (and why)
Allow you to pick which page 1 sites to use for comparison.
Have the ability to compare content both BEFORE and AFTER publication.
“SEO is a game of match and exceed.”
Google has been serving results based on intent buckets for at least 3.5 years now.
This means that they’ll return a mix of intents for most queries to increase their chance of getting the click (hyper-specific long-tail queries where the searcher’s intent is obvious don’t fit this scenario).
Most pages can be slotted into one of three intents:
a) Info/Review b) E-commerce c) News
Whichever page classification you have, you should only compare yourself to the similar pages on page 1.
You might outrank an eCommerce result with your info/review page, but you won’t replace it.
…
To rank on page 1, we have to be top 3 for our bucket. But it also means that once we get into the top 30, there are only ten sites we’re competing with for page 1 real estate.
This Google Apps Script is intended to delete rows from the currently active sheet in a Google Spreadsheet if any cell in the row contains one of the specified keywords. The keywords are contained in three different arrays (states, keywords, and cities), which are then combined into one allKeywords array.
The function first retrieves all data from the active sheet, and then iterates through each cell of each row to check if it contains any of the keywords. If a keyword is found, the row number is added to the rowsToDelete array. Finally, the script iterates through the rowsToDelete array from bottom to top to delete each row by its number.
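The original is a Google Apps Script, but the same filtering logic is easy to sketch in Python (this is my re-sketch for illustration, not the script itself; the three keyword arrays mirror the ones described):

```python
def delete_matching_rows(rows, states, keywords, cities):
    """Drop any row where a cell contains one of the combined keywords
    (case-insensitive substring match), mirroring the Apps Script logic."""
    all_keywords = [k.lower() for k in states + keywords + cities]
    return [
        row for row in rows
        if not any(k in str(cell).lower() for cell in row for k in all_keywords)
    ]
```

Note one difference: rather than collecting row numbers and deleting bottom-to-top (necessary in a live spreadsheet so indices don’t shift), a list comprehension can simply build the kept rows.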
Update 7/7/2023:
Uploading the CSV to ChatGPT’s Code Interpreter, instead.
“How many links do I need??”
A simple approach to calculating monthly minimum link building needs for SEO, based on competitive landscape:
Open top 5-10 competitors in ahrefs
Calculate # of referring domains received within last 6-12 months for each
Calculate average of total
Divide by # of months
This is the average number of new referring domains your competitors earn each month, and the minimum you need to build monthly.
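The arithmetic in the steps above, as a one-function sketch (the example figures are illustrative, not from the source):

```python
def monthly_link_target(referring_domains_gained, months=12):
    """Average new referring domains per competitor per month -- the floor
    for your own monthly link building. `referring_domains_gained` holds
    each competitor's referring-domain growth over the chosen window."""
    avg_total = sum(referring_domains_gained) / len(referring_domains_gained)
    return avg_total / months
```

For example, three competitors gaining 120, 240, and 180 referring domains over twelve months average 180 total, or 15 new referring domains per month as your minimum target.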