
Google is not an Algorithm: Google Imparts Intent

Back in November the big news from Pubcon was that Google imparts intent when evaluating a web site for “quality”. Matt Cutts had accepted an invitation to participate in an open review of web sites, and had done a whois lookup on a publisher as part of that process. While examining the web site presented for review, Matt commented on “other domains” that publisher held, as if they were somehow germane to the discussion of the particular domain under examination. The webmaster stated that his “other domains” were unrelated, and not even linked to the domain under review. Matt acted as if that were irrelevant. Several SEOs in the audience were offended: what did other, unconnected domains have to do with the one published domain under review?
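To make the mechanism concrete, the kind of cross-domain check Matt performed can be sketched in a few lines. This is a hypothetical illustration, not Google’s method: the registrant records below are invented, and a real check would pull live data from a whois service rather than a hard-coded dictionary.

```python
# Hypothetical sketch of a whois-style cross-domain check: given a
# domain under review, find other domains whose registrant email
# matches. All data here is invented for illustration.

def shared_registration(under_review, all_records):
    """Return other domains whose registrant email matches the reviewed domain's."""
    target = all_records[under_review]["registrant_email"]
    return sorted(
        domain
        for domain, record in all_records.items()
        if domain != under_review and record["registrant_email"] == target
    )

records = {
    "reviewed-site.example":  {"registrant_email": "owner@example.com"},
    "unrelated-blog.example": {"registrant_email": "owner@example.com"},
    "other-site.example":     {"registrant_email": "someone@else.example"},
}

print(shared_registration("reviewed-site.example", records))
# The "unrelated" blog surfaces anyway, purely on registration data.
```

Note that the flagged domain need not link to, or have anything else in common with, the reviewed one; matching registration alone connects them, which is exactly the circumstantial-evidence issue discussed here.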

Matt didn’t clarify, but it was obvious to me and others that Matt was seeking to impart intent upon the publisher. The “other domains” were considered clues… additional information about the character of the web publisher. Google was using more than direct evidence to establish a basis for trust of a domain. Google was judging webmasters based on circumstantial observations, with no opportunity for clarification or feedback from the publisher. It was “big news”, but it wasn’t widely recognized as such. I wanted to write about it, but decided not to. I thought it needed more time to gel… and I also considered it valuable information worthy of experimentation.

I have held for years in my SEO practice that a publisher needs to maintain a certain degree of plausible deniability when optimizing for Google. My SEO Secrets parody was inspired by the one-sided “righteousness” of Google and the effect it has on the SEO world. Matt’s comments in November were my first confirmation that he and his anti-spam team were actively judging webmaster character, and they suggested the team was willing to use possibly disconnected circumstantial evidence as a basis for some of that judgement.

Today I am at SMX Seattle. I arrived late and missed Matt’s morning session, but according to the transcript at SERoundtable, Matt Cutts once again revealed Google’s imparting of intent. According to my read of the transcript, Matt was directly questioned by the audience on this issue. He said yes, the existence of additional domains with matching registration information is directly relevant. He gave an example of “spammy” domains suggesting a webmaster’s spammy character. He defended Google’s consideration of that “evidence” when evaluating a web site. I hadn’t yet read the transcript when I met Matt this afternoon, but I will try to ask him to comment on this when I see him next.

The take-away is that all available evidence is ripe for the picking by Google, and can and will be used (against you?) when evaluating an optimized website for “quality”. Those who still deny that Google is a competitor should take note, although I may hope along with them that Google will continue its recent efforts at increased transparency when dealing with web site quality.

Google is not just an algorithm. Humans at Google now impart intent upon webmasters, with judgements based on observations that include what is commonly known as circumstantial evidence. Honestly, I am OK with this, because I believe it is a good way to improve quality. I have serious reservations, however, about subjective judgements hidden behind closed doors, as they now are for many webmasters. I can think of many situations where circumstances might erroneously and unfairly suggest a spammy situation. I’d hate to see one of my sites penalized unfairly, with no opportunity to correct the misconceptions that led to a penalty or site ban.

For optimizers and the business owners hiring them, it is more important than ever to understand the mindset of the Google quality people, such as Matt Cutts. The more competitive you are, the more likely Google humans will be involved in the process of ranking your web pages. The quality of the human doing your SEO will become more and more important as you become more competitive.

John Andrews is an SEO consultant and Competitive Webmaster out of Seattle.


  1. Halfdeck wrote:

    Nice post John.

    “I have serious reservations, however, about subjective judgements hidden behind closed doors”

    After dealing with several Google AdWords reviewers/reps, who often come to opposite conclusions about identical landing pages, I have similar reservations. In fact, I trust Google’s algorithm far more than I do Googlers.

    Tuesday, June 5, 2007 at 8:19 am | Permalink
  2. I’ve only just found your blog, John, via Peter da Vanzo, and you serve up some really thought-provoking stuff. I guess there are some worrying aspects about this, and like Halfdeck I would be happy to just take what the algorithm gives me (wait a minute, isn’t that ASK now?). However, it’s an imperfect world and no search engine will do it exactly as I would like them to do it.

    In such an imperfect world, I’m happy to use mostly Google even if they’re also trying to figure out my intent as well as that of the web page producer and making the best match.

    Thursday, June 14, 2007 at 2:51 am | Permalink
  3. I am almost 100% sure that they are not only the algorithm. And to be honest, I am happy about that. The only thing is that they need to be aware that, being human, they are subject to making mistakes, and as long as they recognize and discuss their mistakes they will be a fair player. Such great control over the internet is a benefit, a gift, a grand prize and a result of their smart decisions in the past. If you ask me, the Google guys deserve it. However, it is a responsibility. And I am sure they know that.

    Monday, July 7, 2008 at 2:40 am | Permalink