The mantra “Content is King” has been bandied around for years now: it was certainly huge when I started this blog in 2007.
As a “content person” I’ve always wanted this to be true, and acted as if it were. In 2009, I was running what we would now call “Content Marketing” for a big travel client, using a network of authors with their own profiles, long before “Authorship” came to the fore.
The truth, however, is that between then and now content, while important, has played second or third fiddle to all sorts of other factors in SEO, especially inbound links and other technical tweaks. There have always been ways to artificially boost a site’s authority because, until recently, search engine algorithms have been pretty simple beasts, and it has been very difficult to turn content “effectiveness” into a meaningful metric.
Google’s Hummingbird update was probably the beginning of the end for technical fiddles. In fact, it’s wrong to call Hummingbird an “update”; it was much more than that. We’d all been talking about “Semantic Search” for years, but Hummingbird meant that semantics, the basis of real language, really would play a part in search quality.
King Tut’s Tomb
The reasoning behind quality content being a ranking factor is simple. People go to search engines to find answers to their questions, and if they get unsatisfactory answers they don’t come back. So search engines need to find the sites giving the best answers and feature them prominently. That doesn’t necessarily mean the sites that everyone knows about, in other words the ones with the most inbound links.
Any researcher will know the joy of finding something that has never been seen before, or which has been lost. In real life, you might liken this to the discovery of Tutankhamen’s Tomb. All the wonderful treasures were there to be found because all trace of the tomb had been lost for centuries.
The search engines need to do two things to ensure they show the best search results:
- They need to find the best content in the first place
- They need to assess that content just as an end user would, and ascribe a value to it
The problem with the first is that inbound links have become incredibly devalued as a result of years of “Black Hat” SEO techniques, so some other way to validate existing ranking factors is required. The move towards the “entity” model is one: a simple mention of your URL on another site is seen as an endorsement or, in the case of a poor review, exactly the opposite. And then there’s Social Media: not a ranking factor in itself, as Google’s John Mueller repeatedly points out, but certainly a flag that something is making waves and should be investigated.
All of this means that keeping your house in order, with good technical SEO, meta tagging, Schema markup and fast server response times, is still important.
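To give one concrete example (my own sketch, not anything Google prescribes), marking up an article page with schema.org’s Article type is just a matter of embedding a small block of JSON-LD in the page. The headline, author and publisher below are hypothetical placeholders, and I’ve used a few lines of Python only to keep the structure readable:

```python
import json

# A minimal, hypothetical schema.org "Article" object, serialised as
# JSON-LD. The output would sit inside a
# <script type="application/ld+json"> tag in the page's <head>.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Content is King (At Last)",          # placeholder
    "author": {"@type": "Person", "name": "Example Author"},
    "publisher": {"@type": "Organization", "name": "Example Blog"},
    "datePublished": "2015-03-01",                    # placeholder
}

print(json.dumps(article_schema, indent=2))
```

A few lines like that cost nothing and make it unambiguous to the search engines what the page actually is.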
Say it like it is
But this is a blog about content, and it is the second point that matters to us. Once upon a time, and not so long ago, all you needed to do with content was make sure you had your chosen keywords in the right ratios and that was it: no attention to quality, readability or relevance was really required. Today, Semantic Search means you don’t need to worry about keyword densities; you only need to write authoritatively about your subject in a natural way. Of all the search engines, Google has perhaps spent the most time, and money, trying to get content assessment right, even lashing out big bucks on large chunks of time on supercomputers owned by the US Department of Defense.
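For anyone who never had to do it, that old keyword-density check was nothing more sophisticated than a ratio of occurrences to total words. A rough sketch, with hypothetical copy and keyword, looks like this:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Old-school keyword density: occurrences of the keyword
    divided by the total number of words in the text."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# Hypothetical snippet of over-optimised copy
copy = "Cheap flights to Spain. Book cheap flights today. Cheap flights guaranteed."
print(f"{keyword_density(copy, 'cheap'):.0%}")  # 27%
```

Nothing in that ratio says anything about quality, readability or relevance, which is exactly the point.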
Today, as a result of all this effort, Google can now read as well as any 10-year-old: a 10-year-old that can speak 123 languages and name 170 million references in less than a second.
That means that, for the first time in forever, what you write on your site is as important as anything else. If a search engine arrives via a tweet, a Google+ post or even an inbound link and doesn’t find something useful, unique and usable there, it will shrug its subset-evaluation functions and move on.
Google has gone public about this. In February 2015, New Scientist magazine published an article called “Google wants to rank websites based on facts not links”, saying that preferential treatment would be given to websites carrying more truthful information. Basically, liars won’t prosper.
As an SEO I increasingly examine my chosen specialty with despair. It was never easy, but it is becoming more and more difficult to see what more you can do, whether code updates, responsive layout tweaks or time-to-first-byte improvements, to boost a site’s rankings. Hopefully, there is now something that works … content.