
Jason Calcanis: From Now On, Links are Everything

Second title of this post is: Read My Lips: Content is King

Third title of this post is: Google Can’t Read: Rich Navigation and the Return of Link Power

With the growth of asynchronous web technologies, commonly called AJAX, comes a new form of “rich navigation”: navigation supplemented with ancillary information that can get quite voluminous. With AJAX, a designer can offer a simple list-style navigation that loads a full page (or more) of supporting data for each listed item, before any actual click-through to the item’s “page”. The URL doesn’t change, but the view is updated with a significant amount of information tied to that one, perhaps two-word, navigation item.
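The pattern can be sketched roughly as follows. This is a minimal simulation, not real Microsoft markup: the HTML, panel names, and helper functions are all hypothetical, chosen only to show that the crawler's view (static anchor text) and the user's post-click view (anchor text plus asynchronously loaded content) diverge while the URL stays the same.

```typescript
// Hypothetical static markup: all a crawler ever fetches for this nav item.
const staticHtml =
  `<a href="#" onclick="loadPanel('products')">products &amp; related technologies</a>`;

// Content the click handler would fetch asynchronously; the URL never changes.
const ajaxPanels: Record<string, string> = {
  products: "Windows, Office Online, Servers, Games & XBox",
};

// What a search engine sees: just the anchor text left after stripping tags.
function crawlerView(html: string): string {
  return html.replace(/<[^>]+>/g, "").trim();
}

// What a user sees after clicking: the anchor text plus the loaded panel.
function userView(html: string, panel: string): string {
  return `${crawlerView(html)} | ${ajaxPanels[panel]}`;
}

console.log(crawlerView(staticHtml));
console.log(userView(staticHtml, "products"));
```

Everything in `ajaxPanels` is invisible to the crawler, which is exactly the gap described above.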

Take the Microsoft home page as an example. A search engine will see the page as it appears before any asynchronous data loads. The navigation on the right side (the big box labeled “All Microsoft Sites”) includes an item “products & related technologies”. That is all the search engine will see for that navigation item. But if you click, you get an AJAX-loaded pop-over with a whole “page” of additional information supporting “products & related technologies”, which the search engine never saw (search engines don’t activate AJAX clicks). The URL didn’t change, but content was added to the view, including “Windows”, “Office Online”, “Servers”, “Games & XBox”, and more.

Now this is just one example, and it looks like Microsoft hasn’t even filled it out yet (it is very redundant right now), but I think it shows the problem: the search engines will miss most of the content of this “view”, because they are still following a URL-as-page model. Worse yet, in my opinion, is what will happen as a consequence.

Basically, as the web moves to asynchronous views, Google becomes illiterate. The page is about Windows and XBox and Servers, but all Google reads is “products & technologies”. In the SEO world, an early optimization would have been to use “XBox and Games” as a navigation link, so that Google knows the link leads to information (on the site) about XBox and Games. A site map would also be considered for its ability to do the same: define the content semantically using internal hyperlinks, even if they were not included in the user interface.
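That old optimization can be illustrated with a small sketch (the markup and the `anchorTopics` helper are hypothetical): a crawler's only clue about a target page's topic is the text inside each `<a>` tag, so descriptive anchors in a site map expose topics that a generic nav label hides.

```typescript
// Hypothetical helper: extract the anchor text a crawler would index
// from each internal link in a fragment of static HTML.
function anchorTopics(html: string): string[] {
  const topics: string[] = [];
  const re = /<a [^>]*>([^<]+)<\/a>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) {
    topics.push(m[1]);
  }
  return topics;
}

// Generic navigation: one vague phrase standing in for many products.
const genericNav = `<a href="/products/">products & technologies</a>`;

// A semantic site map: each internal link names what it leads to.
const siteMap =
  `<a href="/xbox/">XBox and Games</a><a href="/windows/">Windows</a>`;

console.log(anchorTopics(genericNav));
console.log(anchorTopics(siteMap));
```

The first fragment tells the engine nothing about XBox or Windows; the second defines both topics through anchor text alone. AJAX-loaded pop-overs give the crawler neither.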

Now, with AJAX used for this rich navigation, that becomes nearly impossible. The amount of cloaking that would be needed goes well beyond what would normally be considered acceptable under the Google guidelines, and into the realm of a second, HTML-based site for screen readers. Not much better than Flash, eh? Not a pretty consequence of the “advances” of asynchronous views, and certainly not one acceptable to the designers advancing the use of AJAX for user interfaces.

And that is why I believe Google will revert to an increased weighting on back links, once again. The more illiterate Google becomes with respect to semantic content, the more it must rely on other factors for determining relevance. The primary “other factor” available to Google is linkage. Google can’t afford to update the Webmaster Guidelines to permit more cloaking, and it can’t afford to mis-index the majority of the most current, cutting-edge content on the web (at least not for much longer).

Are you ready for the next “Google Update” which will lower the power of adjacent, semantic content and increase the power of inbound links? Given the influence of age and age-related factors on link trust, you should have already started. The old “buy on rumor, sell on news” proves true once again; almost everyone is now talking about how back links don’t matter, and content is king.

Post Script: I love the irony and the play… some dude stands up in front of an SEO audience and tells them they are full of sh*t, and that content is king, and then walks the room with his hat collecting back links. Priceless.



  1. I disagree. This is just one more reason why Google is moving to user data to determine at least popularity.

    John adds: Jeremy, consider this:

    • Google can only know a very small percentage of user activity, and that activity is skewed (toolbar users, account holders, dirty cookie data)
    • Popularity is not the same as relevance.
    • In the short term, if content goes behind AJAX, what tools does Google have in hand to help it determine relevance?

    Given the rich integration of linkage in “the algorithm” already, I can’t get past the idea that linkage will be the patch that carries Google to the next level, as long as that takes.

    Friday, December 22, 2006 at 12:54 am | Permalink
  2. Jeremy Luebke wrote:

What makes you say they only have a very small percentage of user activity? What they don’t already have, they can buy from places like Akamai. Add that to the toolbar, analytics, user accounts, search activity, and so on, and they have more than enough data to profile just about any site on the web with at least a little information. They have already proven that they do not care about the small-fry websites with almost no traffic (sandbox).

In the end, though, it is link gaming that is leading them to integrate user data into the SERPs, not issues like Ajax. Ajax is really no worry to Google; it is going to share the fate of Flash. Everyone is starting to overuse it, and will until they realize they are missing out on search engine traffic. Users also cannot bookmark exact content. There will then be a movement to scale it back and use it only as a complement to a site, not the site itself.

Google has proven that they control which direction the web development world takes, not the other way around. Google controls the traffic, and traffic is what business owners want, which in the end will take precedence over a fancy interface and what developers want.

    Saturday, December 23, 2006 at 11:01 am | Permalink