JavaScript & SEO

“Because of the lag between crawling and indexing on JavaScript websites, by the time the crawler finally gets around to crawling the URLs of deeper pages that have been discovered by the indexer, the crawl scheduler wants to go back to crawling the already known pages because their PR has already been calculated by the PageRanker, so they have a higher URL importance than the newly discovered URLs.”1

I'm not going to worry over this one just yet. It feels wiser to simply absorb useful information from the industry and wait for The Powers That Be to make their move.

Maybe Google will simply flip the switch, and concerns over JavaScript rendering will become another thing of the past. But if this Twitter thread and the computational costs of doing so are any indication, I am not holding my breath.

Maybe widespread adoption of isomorphic JavaScript applications? Maybe “Dynamic Rendering”?
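For reference, here is roughly what the dynamic rendering idea looks like in practice: known crawlers get pre-rendered HTML for a URL, while everyone else gets the normal client-side app. This is only a sketch assuming a Node/Express setup; the bot pattern, the prerender() helper, and the app shell are placeholder stand-ins for illustration, not anything Google or a particular framework prescribes.

```javascript
// A rough sketch of "dynamic rendering": crawlers get pre-rendered HTML,
// everyone else gets the usual client-side app shell. The bot pattern,
// prerender() helper, and app shell below are placeholders.
const express = require('express');
const app = express();

// Rough user-agent check; a real setup would use a maintained crawler list.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;

// Hypothetical helper: in practice this would return a cached snapshot of the
// fully rendered page (e.g. produced ahead of time by a headless browser).
async function prerender(url) {
  return `<!doctype html><html><body><h1>Prerendered snapshot of ${url}</h1></body></html>`;
}

app.use(async (req, res) => {
  if (BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    // Crawlers receive static HTML they can index without executing JavaScript.
    res.send(await prerender(req.originalUrl));
  } else {
    // Regular visitors receive the JavaScript-driven app shell.
    res.send('<!doctype html><html><body><div id="app"></div><script src="/bundle.js"></script></body></html>');
  }
});

app.listen(3000);
```

The appeal is that nothing changes for Googlebot’s crawl path; the cost is maintaining a second rendering pipeline, which is part of why I am still waiting to see how this shakes out.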

https://twitter.com/suzukik/status/1045693011403919360

(November 2018 Update: Chrome Dev Summit 2018, Making Modern Web Content Discoverable for Search was a good presentation.)

“The end result is a very low rate of indexing on the site. Googlebot does its best, but its own URL scheduling systems don’t allow it to spend crawl effort on deeper URLs that it doesn’t see as having any value.”1

In the meantime, I appreciate the following tools for diagnosing any issues:

  • The Web Developer Chrome extension with JavaScript disabled
  • Screaming Frog text-only crawls
  • Google’s Mobile-Friendly Test and manual searches for crucial elements
  • Diff Checker for source vs. rendered code differences (see the sketch after this list)

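That last diff is easy to generate yourself. Here is a rough sketch, assuming Node 18+ (for the built-in fetch) and the puppeteer package; the file names and example URL are arbitrary. Paste the two files into Diff Checker and look for anything that only exists after rendering.

```javascript
// Dump the raw HTML source and the JavaScript-rendered DOM for one URL,
// so the two can be compared in Diff Checker or any other diff tool.
const fs = require('fs');
const puppeteer = require('puppeteer');

async function dumpSourceAndRendered(url) {
  // Raw source: roughly what a text-only crawl sees before any JavaScript runs.
  const raw = await (await fetch(url)).text();
  fs.writeFileSync('source.html', raw);

  // Rendered DOM: the page after scripts have executed and the network is idle.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  fs.writeFileSync('rendered.html', await page.content());
  await browser.close();
}

dumpSourceAndRendered('https://example.com').catch(console.error);
```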

[1] https://searchengineland.com/technical-seo-makeup-250408