John Andrews is a Competitive Webmaster and Search Engine Optimization Consultant in Seattle, Washington. This is John Andrews’ blog on issues of interest to the SEO community and competitive webmasters. Want to know more?

johnon.com  Competitive Web & SEO
November 27th, 2009 by john andrews

Ignorance is Powerful

Ignorance is more powerful than knowledge. Don’t believe me? Look around you. Which force is shaping the world around YOU?

November 25th, 2009 by john andrews

Pay No Attention to the Little Man Behind the Curtain…

Since it’s Thanksgiving Day and in my culture that means watching The Wizard of Oz, I thought I’d reference the wizard (the little man behind the curtain) while making a note of Google’s latest “innovation” — the breadcrumb URL replacement. Aaron over at SEOBook (a great source of free SEO tools by the way) notes that Google has strayed from its focus on relevance, and replaced user-friendly URLs with not-so-meaningful site breadcrumbs.

See some examples of Google breadcrumbs in the SERPs here.

I like that Matt Cutts responded by saying he’d pass along the word… suggesting that this innovation was not a quality factor, and perhaps had not received much internal debate at Google with respect to the relevance argument. That makes sense to me, because this sort of Google innovation is anti-competitive, and part of a long-term internal strategy. I told Aaron it was just another step towards eliminating the URL. I have long believed Google wants to eliminate domains and URLs for lots of reasons.

But right now on the eve of Thanksgiving, I want to remind everyone of the Great and Powerful Wizard that was actually a little man behind a curtain.

The breadcrumbs are the Wizardry… what Google wants everyone to watch. Something new! Another Google innovation! Look, helpful breadcrumbs! Naturally it is the professional SEO who first remarks that this innovation does not enhance the SERP, because properly optimized sites have helpful URLs. In the example Aaron provided, Google has replaced a very helpful, meaningful and rich URL with a much less relevant breadcrumb. But Google wants everyone to watch how those breadcrumbs help average web sites, because Google doesn’t want you to look behind the curtain.

What’s behind the curtain? Google hates those who make a market around Google. They hate the companies that make tools that mine Google’s data. They hate the optimizers who scrape Google and they hate the rank checking tools that charge money to report on Google status and ranking. They hate domains that garner direct navigation traffic, because users can find shoes at shoes.com without asking Google where to buy shoes (and revealing that they are a consumer, tracked by numerous Google-owned marketing cookies, now poised to execute a commercial transaction).

This URL removal is an anti-competitive practice that seeks to hinder the efforts of companies that re-sell Google’s data, whether they be SEO research services or re-purposers (scrapers). It is the unique URL (and unique domain name) that enables everyone else to make money on the web. As long as every consumer has to go through Google to find a web page, Google has a chance to take a piece of the profits.

The urgency, however, comes from the competition. As long as Google spends its efforts creating relevant collections of URLs and publishing them to the web for free, others will mine that resource and re-sell it into niche marketplaces for profit. And that harms Google in the long run (or so the thinking goes… you do need to think it through, though).

Think about the businesses that have stored Google result sets for years, and now offer research services. The businesses that offer SEO services, helping web sites rank higher than Google naturally ranks them. The reporting services… even the “let us manage your Local Business Center account” agencies are parasitic to Google’s business model of one website, one business owner, one Google customer.

This year Google renovated its SERPs to use 302 redirects through a Google redirector, instantly breaking many third-party tools and disrupting all sorts of user-centric research tools. AJAX result sets also debuted, redefining the Google SERP as dynamic in a whole new way. And now, in this test, the URL is replaced in many of the results, swapped for breadcrumbs. It could have been anything, not just breadcrumbs. Anything but a direct URL, or a URL which could be parsed out.

Personally I like to see big successful companies that exert monopoly-like power over public markets start putting their energy into defensive tactics. It’s energy not put into real innovation. It takes a wee bit of pressure off the upstarts that hope to some day challenge the monopoly. As my lacrosse coach used to scream at me every day: whenever you’re relaxing (not working out), your competitor is getting bigger.

November 5th, 2009 by john andrews

Google Closure… will you register your code with the Borg?

“Think! Think! Think!”, said Pooh.

Today Google announced Google “Closure”, a set of tools for efficiently working with javascript. With Closure, Google has theoretically “closed the loop” on a number of javascript problems. Not problems developers have with javascript, but problems Google has with javascript. The problems haven’t been solved, mind you, because we don’t all use Google Closure for everything yet, but conceptually, if we did, Google would have a much easier time as web overlord.


I think it’s important to change perspective for a moment, from the Google PR and developer world perspective, to the competitive SEO perspective. I’m not saying any of this is fact; I’m merely implying intent the same way Googlers often imply intent when looking at webmaster activity. I’ve seen Google employees look at a web site that appeared to be clean, and mark it untrustworthy simply because the webmaster appeared to be associated with other sites that were not as clean. That’s the world Google forces us to live in, so Google should live in the same world. If Google expects us to be slimy, it’s a pretty safe bet that Google behaves that way as well.

So what’s this Closure stuff? Google released Closure as four related tools for working with javascript. First, a js compiler, which “compiles web apps down into compact, high-performance JavaScript code”. There is a Firefox tool, which helps you see into compiled code for debugging, since without that no developer would compile anything that wasn’t considered 100% finished. There is a “well-tested, modular, and cross-browser JavaScript library” called the Closure Library, and finally Closure Templates, which make working with the Closure Library easier if you are totally committed to The Google Way and comfortable building js apps on someone else’s framework.
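To make the compiler’s job concrete, here is a hand-written sketch of the kind of renaming and folding a minifying compiler performs — this is my own illustration, not actual Closure Compiler output, and the function names are invented:

```javascript
// Before: verbose, readable source a developer would write.
function computeDiscountedTotal(prices, discountRate) {
  var total = 0;
  for (var i = 0; i < prices.length; i++) {
    total += prices[i];
  }
  return total * (1 - discountRate);
}

// After: the same logic with names shortened and the loop folded,
// roughly the shape of aggressive minifier output.
function a(b, c) {
  return b.reduce(function (d, e) { return d + e; }, 0) * (1 - c);
}

// Both versions must agree on every input.
console.log(computeDiscountedTotal([10, 20, 30], 0.5)); // 30
console.log(a([10, 20, 30], 0.5));                      // 30
```

The point of the Firefox debugging tool becomes obvious here: once `computeDiscountedTotal` becomes `a`, you can no longer debug the shipped code without a map back to the original names.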

So what’s the SEO perspective? Well, you should go back and re-consider why Google may have started hosting the most popular javascript libraries on the Google content distribution network last summer. I raised the issue then but didn’t highlight specific reasons why I was giving it so much attention. Google has long distrusted javascript. Since Google can’t actually crawl and interpret all of the javascript that may be modifying published web content on the Internet, js provides clever webmasters with a means of resisting the Borg. But if we all loaded our standard jQuery, MooTools, and Prototype js libraries from Google’s CDN, Google could “trust” our sites more than sites which hosted their own js libraries. There wouldn’t be any “funny business” if the libraries were known to be clean.
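The “trust” idea fits in a few lines: a crawler could simply whitelist script sources it already knows. This is a toy sketch of that logic, not anything Google has published; the Google Hosted Libraries URL prefix is shown for illustration:

```javascript
// Toy sketch: a crawler "trusts" scripts served from a known CDN
// (here, Google's hosted-libraries path) and flags everything else
// as code it would have to inspect itself.
function isKnownLibrary(src) {
  return src.indexOf('https://ajax.googleapis.com/ajax/libs/') === 0;
}

console.log(isKnownLibrary(
  'https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js')); // true
console.log(isKnownLibrary('https://example.com/js/custom.js'));        // false
```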

With Closure, Google is able to go a step further *if* we all adopt it. Javascript submitted to Closure for compilation could be “indexed” and assigned an id code on the web, so that from that point onward Google would be able to recognize (and trust?) that compiled code. Any change would necessitate a recompile (or, in other words, re-registering your javascript code with Google). Given Google’s development of the Chrome browser, Google could also offer additional incentives for code registration — it could run faster in Chrome. Or it could be pulled from the CDN like jQuery, and your project might benefit from a kick-start if you use the Closure Library and Closure Templates.

Again… I’m not saying this is Google’s intent with Google Closure. I am saying that if I were Google, I would certainly explore this as an opportunity to advance my control of the web without stifling innovation as much as I would otherwise have to… as Google has been doing lately.

With Closure, Google closes the javascript loop, with what is basically registration of js code with the Borg. Like it or not, whether Google does it today or not, it is a viable option for Google, and certainly easier than trying to license white hat SEOs.

Related questions that Google should probably answer if it wants support from developers:

  • is this intended to compete with jQuery? Complement it? Or are devs expected to run jQuery through Closure, too?
  • why build another js minifier? Dean Edwards’ Packer works very well… even better than Closure, according to early reports.
  • what about the Google Web Toolkit js library? What’s the roadmap here… or is there a js roadmap at all?




about


John Andrews is a mobile web professional and competitive search engine optimizer (SEO). He's been quietly earning top rank for websites since 1997. About John
