
We’re All SEO Tools

SEO is Dead: Long Live Competitive Webmastering

My friends at Outspoken Media just published an outline of their SEO audit process. It looks exactly like what should be in a webmastering 101 course. From my perspective, it has nothing to do with search engine optimization, except that search engines control the flow of almost all of the Internet’s traffic these days. That these SEO audit points are actually just good webmastering is an important distinction to make, because every time we treat SEO as a unique entity, we grant Google more authority over the Internet. Why do we keep doing that?

Every time we consider something “improper” because it doesn’t help or perhaps hinders search, we support the search engines in their manipulation of the Internet. This blog of mine is for experienced SEO people, so I don’t explain everything in great detail for the lay person, but it is very clear that Google stifles innovation on the Internet so that it can control traffic flows and the profits associated with that traffic. It is very clear that Google acts as a censor for public dissemination of freely-published materials, and it is becoming more clear that the popularity of SEO is helping Google (and the other search engines) further their control and censorship at all of our expense.

Honestly, did you ever really believe that a commercial entity claiming to want to “organize all of the world’s information” would be unbiased, altruistic and benevolent?

If you feel like throwing up after reading that Wall Street investment banking firm Goldman Sachs is now giving out $20 billion in bonuses (averaging $700,000 per employee this year — Google it), you should really enjoy a look at Google’s profits and the economic impact Google’s manipulation of the web has had on our economy. Unfortunately, you’ll have to wait a few more years for the economists to put all those numbers together. If you ask me, the smartest thing other countries have done, and can do going forward, is to block Google’s entry into their markets until those analyses are available.

In the meantime, let’s try to understand how SEO is adding to the problem. Here is the Outspoken Media list, with my annotations showing why these issues should be discretionary for web publishers, were they not being manipulated by Google:

Duplicate content: I am free to publish the same paragraph twice, or the same article twice. I might do that to reach my audience, who may be reading my web site via different paths. But Google uses an algorithm to penalize duplicate content. They say it’s not a “penalty” but from a publisher/media perspective, it is a penalty.

If I publish my editorial on the home page of my site, and again on the Editorial Page, Google will decide which one to index and offer in the search results, based not on my publishing objectives, but Google’s own algorithm.

The solution recommended by SEO experts? A complicated re-review of my publishing model, looking not only at my visitors and their attention/reading habits, but whether factors like incoming links from other sights will cause Google to remove my home page from the search index. This is abhorrent business behavior by Google, and one of many examples of how Google is an aggressive, profit-driven abuser of the free and open Internet, and unworthy of our support. Yet, we support them and even go so far as to say that the proper way to publish is to defer to Google’s desires.
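For the record, detecting that two URLs carry the same article is trivial machine work. Here is a minimal Python sketch; the markup-stripping regex and sample pages are illustrative, not from any real site:

```python
import hashlib
import re

def content_fingerprint(html: str) -> str:
    """Strip markup and collapse whitespace, then hash, so the same
    article published at two URLs yields the same fingerprint."""
    text = re.sub(r"<[^>]+>", " ", html)           # drop tags
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

# The same editorial, once on the home page and once on the Editorial Page.
home = "<html><body><p>My editorial, word for word.</p></body></html>"
editorial = "<div class='page'><p>My editorial,   word for word.</p></div>"

print(content_fingerprint(home) == content_fingerprint(editorial))  # True
```

Note that the engines later offered a rel="canonical" link element for exactly this situation, which only underscores the point: the accommodation runs from the publisher to the engine, not the other way around.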

Redirect issues: A redirect is a technical solution to a common problem: the desired information has moved to another URL. Done properly, the user looking for Article X gets forwarded to where Article X can now be found. The SEO audit looks for “redirect issues”, including technically incorrect implementations. That’s good webmastering, not SEO. But the SEO audit also checks that 301 or 302 redirects are used according to Google’s guidelines on redirects. Google claims some redirects that are proper by webmastering standards are “wrong”, and its guidelines state that not following them can get you penalized (even if you follow proper HTTP standards). Once again, we see Google setting a new standard in its favor, without participating in the open, democratic standards-setting process. Good web citizen? Hardly. Yet, we seem to support it.
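A technically correct permanent redirect is not complicated. Here is a minimal sketch using only Python’s standard library; the paths and article text are made up for illustration:

```python
import http.server
import threading
import urllib.request

class MovedHandler(http.server.BaseHTTPRequestHandler):
    """The old URL permanently forwards readers to the article's new home."""
    def do_GET(self):
        if self.path == "/old-article":
            self.send_response(301)                     # 301 = moved permanently
            self.send_header("Location", "/new-article")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Article X lives here now.")
    def log_message(self, *args):                       # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), MovedHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/old-article"
with urllib.request.urlopen(url) as resp:               # urllib follows the 301
    body = resp.read()
print(body)  # b'Article X lives here now.'
server.shutdown()
```

The user asking for the old URL lands on the new one; that is the whole job. Whether you chose a 301 or a 302 used to be your business decision (is the move permanent or temporary?), not a compliance exercise.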

Indexing/crawl issues: Good webmastering should not prevent spidering or crawling of web content by search engines. Any “indexing issues” are entirely based in the search engine and its approach to the web. If you are not indexed and want to be, then yes, you need to deal with the search engines doing the indexing. In the long run, this pure search engine issue is really a government and societal issue, having to do with civil rights (equal access to information) and responsible government (upholding the public trust). Sadly, we are a ways off from addressing those broader concerns.

Improper categorization: The publisher decides how to categorize the published content. Good webmastering achieves the goals of the publisher. Only when we start to look at the specific ways that search engines index the content, do we start to consider some forms of categorization “improper” (or sub-optimal). Again, until society works through its dependencies on private for-profit companies like Google for universal information access, we suffer the whims of Google abuse and have to turn to specialists for help navigating those treacherous waters.

Crappy title tags: I’m not sure what “title tags” are, but the title element is a very important part of SEO and a very minor part of publishing. Each page has one title element, which is displayed in the browser’s title bar at the top of the window. Title elements offer little value to the publisher, because users ignore them. Google apparently thinks title elements have no value at all to the reader, and dropped them from display entirely in its Chrome browser. However, Google has co-opted the HTML standard once again, and made title elements critically important for SEO.

Crappy meta descriptions: meta description tags are another part of HTML standard, and are invisible to the reader. Once again publishers need only pay attention to these if they are following Google or other search engine guidelines, for some specific reason (such as getting traffic from Google). Since they are part of the HTML specification, good webmastering should ensure these meta tags are present and proper, by publishing standards.
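Checking these two elements is, again, basic webmastering rather than SEO wizardry. A minimal sketch using Python’s standard-library HTML parser; the sample page is invented:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the title text and meta description from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = """<html><head>
<title>Editorial: On Webmastering</title>
<meta name="description" content="A short summary of the editorial.">
</head><body>...</body></html>"""

audit = HeadAudit()
audit.feed(page)
print(audit.title)        # Editorial: On Webmastering
print(audit.description)  # A short summary of the editorial.
```

If either comes back empty or boilerplate, that is a publishing quality problem worth fixing for your readers, with or without Google.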

Usability problems: Usability is an art and science dedicated to users, not search engines. Not sure why this is in an SEO audit, but of course good competitive webmastering is concerned with usability. I know Google sometimes falls back on a general “good for users” usability argument when pushing involuntary standards onto webmasters, but they also violate that logic frequently. Publishers should make good websites for their readers…. and if they don’t their readers will let them know via the free market. With Google, that free market goes away because Google might decide not to send you any of that market traffic in the first place.

Conversion problems: just like usability… not an SEO issue, but of concern to publishers and competitive webmasters.

Keyword research/Keyword density: the entire field of keyword research developed around search engines and their private, unilaterally imposed policies. If you are catering to search engines, you need an SEO and keyword researcher. But there is no reason why publishers (masters of language and communication, at least if they survive in a free marketplace) need a third party to tell them what words to use in their documents.

Internal linking strategies, anchor text, and Sitemaps: This is another example where Google corrupts the publishing process for commercial gain, and is empowered to do so by the marketplace’s adoption of SEO and other Google-imposed bastardizations of content publishing. A publisher naturally adds navigation to its content to serve the users. Newspapers and books have had page numbers for that purpose for as long as I can remember.

Poor navigation technology or internal linking is a webmastering quality issue. Improper linking with respect to publishing goals is a business issue.

When Google starts to manipulate that process, in ways that hinder, stifle, or corrupt the free press, Google is not only interfering with market dynamics which are known to improve the marketplace over time, but artificially supporting market practices which can hurt a market over time. This is just one example of this very serious issue that few are acknowledging… those economic analyses we will receive in a few years will prove this.

When a publisher starts to play with XML sitemaps at the request of search engines, that extra work is injected into the system at a cost to the publisher but generates profit for the search engine. That, by definition, decreases productivity, and can be considered one of the many cost externalizations the search engines have achieved over the years while the SEO industry has supported their market manipulations.
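The “extra work” itself is mechanical. A minimal sketch of generating a sitemaps.org-style XML sitemap in Python; the URLs are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Emit a minimal XML sitemap per the sitemaps.org 0.9 schema."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["http://example.com/", "http://example.com/editorial"])
print(sitemap)
```

That the work is trivial to automate does not change the economics: the publisher pays for it, and the engine collects the benefit.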

External linking strategies and anchor text: External linking strategies only exist because of the search engines’ corruption of the publishing process. Otherwise, external linking (and anchor text) is a free market innovation, outside the purview of the content publisher (it is others, external, who make those links to your content). If you want to start to understand how Google stifles innovation, and recognize how Google is one of the worst web citizens the web has ever encountered, start with linking.

Site architecture and URLs: Again, this is a good webmastering issue and success should align with the achievement of publishing goals. To the extent that SEO intervention is needed beyond good webmastering or alignment of publishing with business goals, that intervention is necessary only because of search engine imposed restrictions. More stifling of innovation. Society needs to address the value of that, just as they need to re-evaluate the value of the current Federal Trade Commission (FTC) here in the states.

Nofollow, disallow, noindex: these are directives used in robots.txt and certain meta tags, and handling them is part of good webmastering, again serving business goals. To the extent that intervention is needed by an SEO, it is solely because search engines have corrupted the normal standards process. If doing it properly doesn’t achieve results in the marketplace because of specific search engine imposed restrictions, then what other choice is available but stepping in line and changing to accommodate the search engines? Stifling it is… as established.
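For completeness: Disallow lives in robots.txt, while nofollow and noindex are meta-tag and attribute values. A sketch of how a well-behaved crawler is expected to honor robots.txt, using Python’s standard library; the paths are invented:

```python
import urllib.robotparser

# A publisher keeps drafts and admin pages out of every crawler's reach.
robots_txt = """\
User-agent: *
Disallow: /drafts/
Disallow: /admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/editorial"))  # True
print(rp.can_fetch("Googlebot", "http://example.com/drafts/x"))   # False
```

The protocol is simple and publisher-directed; the complications begin when an engine layers its own guidelines, and penalties, on top of it.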

Social media indicators: Another thing that belongs on the publishing side, not on an SEO audit list. Social media involves webmastering when the business goals are supported by a publishing strategy. This has nothing to do with SEO, until search engines corrupt the free market dynamics and establish guidelines for how society should behave. Which they have.

You might recognize this as Google’s clearest intrusion into society and behavior to date, but my perspective suggests the destruction of linking was more damaging. Also the sharing of aggregate user data with other entities is disgusting. Social Media on an SEO audit list? What has this world come to… and perhaps more importantly, why are we lining up and supporting it?

The answer, I’m afraid, involves politics, something technical people have never really liked and often choose to ignore. As a result of our (in)actions, we get the web we have and the web that’s coming, instead of the web we all imagined it could be when it all started, and probably still dream it might be as each new, cool innovation comes onto our radar screens. Radar screens which, by the way, are controlled by Google.

What can you do?

Don’t use Chrome. Wait for Firefox to catch up, or use Opera (it’s wicked fast).

Don’t use Google Analytics. It’s priceless as a business tool for Google, so if you refuse to use it, Google will be forced to pay attention to your reasons.

Don’t use Google. Use another search engine, or rely on friends’ recommendations. If you gave Bing every other search you currently do on Google, Bing would have half your search traffic. Yes, that’s the power you have.

Chat with your local political representative. The simple act of approaching him will make him pay more attention to you. That’s how it works. Let him know you are afraid of Google having too much power, and think it’s time to start paying attention before it’s too late.

Block Google analytics spying on your system. You can do it easily via a few means. When you do this, every site you visit that uses GA will not report your visit back to Google. It’s a simple step that can go a very long way towards improving things on the web.
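One of those means is the hosts file (its location varies by operating system, and editing it requires administrator rights). Here is the idea sketched as a pure function in Python; the GA hostnames listed are the commonly cited ones, so verify against your own traffic logs:

```python
# Hostnames commonly cited as serving the Google Analytics beacon script.
GA_HOSTS = ["www.google-analytics.com", "ssl.google-analytics.com"]

def block_analytics(hosts_file_text: str) -> str:
    """Return hosts-file content with the GA domains pointed at
    127.0.0.1, so the tracking beacon never leaves your machine."""
    lines = hosts_file_text.rstrip("\n").split("\n")
    present = {line.split()[-1]
               for line in lines
               if line.strip() and not line.startswith("#")}
    for host in GA_HOSTS:
        if host not in present:           # don't add duplicates
            lines.append(f"127.0.0.1 {host}")
    return "\n".join(lines) + "\n"

print(block_analytics("127.0.0.1 localhost\n"))
```

Apply the same two lines by hand to your real hosts file and every GA-instrumented site you visit stops reporting you back to Google.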

I recommend you ask your Social Media circles for advice: “How can I block Google Analytics? Thanks.” And once you do it, pass it along with a tweet: “Here’s how you block Google Analytics spying on you -link”. Go ahead. Help yourself, and help others.


  1. I’m a competitive webmaster, not a SEO? I can live with that!

    I feel remiss in my duties as a competitive webmaster, because I do focus almost entirely on optimizing sites for Google. It’s like a drug. We’re reliant on them because of the market share and convincing people that they’re going to survive (especially in this economy) without Google is no small task.

    I’ve been in the trenches lately, maybe it’s a good weekend to sit back and re-evaluate the why.

    Friday, July 3, 2009 at 5:04 pm | Permalink
  2. Alan Bleiweiss wrote:

    Are you kidding? [blah blah blah] … I’d love to see if you have the balls to post my comment.

    @Alan courage I have. Stupidity? Not so much. Say something useful and meaningful and I’m happy to post it. And as for that title tag thing, it’s really not that important either way… just an old rant that may not be relevant any more.

    Friday, July 3, 2009 at 5:44 pm | Permalink
  3. john andrews wrote:

    @Rhea you’re definitely an SEO, having graduated from Competitive Webmaster and dedicated yourself to helping less-competitive webmasters. But it’s sad that the client web people don’t think they need to improve, but rather think they need SEO. And that is part of the market manipulation.

    Friday, July 3, 2009 at 10:29 pm | Permalink
  4. deleted wrote:

    Regarding duplicate content and seo….if I am catering for the US market then want to branch out to the UK, do I need to write new content or just allow them to switch to the other version with localised ads?

    Saturday, July 4, 2009 at 1:26 am | Permalink
  5. bobthebuilder wrote:

    Here’s an opt out for a bit of Google Tracking.

    Take your seats in the theatre.

    Saturday, July 4, 2009 at 5:12 am | Permalink
  6. Dan Brown wrote:

    Hi John,

    What can I say….we are on the same page…Very well written…and hopefully “The Truth will Set Us ALL free”….sooner than later.

    You have the #1 blog in this multi-billion dollar industry.

    I enjoy all your post…very much…because you are honest and straight to the point.


    Sunday, July 5, 2009 at 9:53 am | Permalink
  7. jim wrote:

    …individual freedom is possible only to those who are strong enough, psychologically and morally, to withdraw their sanction from any system that coercively thrives off their productive energies.” (Sciabarra — “The Russian Radical”, pp. 301-302)

    Sunday, July 5, 2009 at 8:02 pm | Permalink
  8. Xavier Paz wrote:

    Nice point, but greatly exaggerated. Clearly, Google is not “the good guy”, but neither are Microsoft or Yahoo or anyone else. And the rules set by Google are not set for world domination, but to prevent abuse, which is much worse to legitimate web publishers than following those rules.

    @Xavier The “to prevent abuse” excuse is long past… it serves Google well, but we should not accept it any longer. Google makes billions and pays out a small fraction to webmasters, not in proportion to their contributions. For the market to survive, the rewards must flow to those taking the risk and putting in the productivity (at least to a sustainable level). Let Google work hard to maintain quality, and thus succeed in a competition amongst search engines. 

    Monday, July 6, 2009 at 12:33 am | Permalink
  9. A refreshing critique of SEO … well, “best practices” or “dogma,” depending on your vantage point.

    I disagree that “Title elements offer little value to the publisher, because users ignore them.” The content of a title element appears in numerous places that have nothing to do with search engines, but are quite important to users – in browser bookmarks, and ubiquitously as hyperlinks to the resource (RSS feeds, links to blog posts, etc.). As often as not, a linked title has little or no context, so a title element that offers a succinct summary of the referenced resource is going to mean more to users than one which is vague, cute or in some other way non-descriptive. This is also the case for meta description attributes when they’re published as a summary accompanying a hyperlinked title.

    External linking strategies, nofollow, etc.? Yeah, the SEO community’s subservience to the big G is nowhere better illustrated than in the slavish acceptance of Google’s strictures on linking issues. I’m ready to digitally bop the next sanctimonious SEO that proudly proclaims that they’re “white hat,” and disparages any deviation from Google’s Webmaster guidelines as some sort of deviancy. Thanks, but Google doesn’t sign my paycheques people.

    @Aaron: nice perspective on title elements used elsewhere… that does show a good deal of value to publishers, but those examples were innovations first driven by marketplace dynamics (e.g. RSS as push). However, much of that has also been corrupted by search engines already. I am convinced we would have seen a lot more innovation (and improvement) if the search engines had not set out rules with such powerful threats for non-compliance. When did you last see a Competitive Webmastering list include writing titles for RSS, or conversion, instead of for search engines?

    Monday, July 6, 2009 at 6:33 am | Permalink
  10. Dan Brown wrote:


    I found this article quite revealing about search engine users:

    Just too important to ignore:


    While we can only speculate as to what any of these data really suggest long-term, Bing attracted over 6 million new US consumers to Microsoft search products to experience it for themselves, compared to the prior week’s traffic.

    *** This part relates to Google and its users…

    The propensity to be interested in (and perhaps aware of) an opportunity to check out a new search experience was far higher among Google searchers than among other searchers. ***

    Now the question is whether the Bing experience has translated interest into actual change in search behaviors. We’ll continue tracking this closely. End Quote.

    Article Link:


    Monday, July 6, 2009 at 1:17 pm | Permalink
  11. Ron Chmara wrote:

    Interesting read, with only one typo that caught my eye (a sight is not a site), but you seem to be placing the blame for SEO cargo-cult thinking upon the planes, or the military, rather than upon the heads of the many witch doctors who profess that they have special and magical ways to bring the planes back.

    Oh, and of course, they’re willing to sell you their ideas, give their sage advice, and spend all kinds of time and effort making themselves important in the magical rituals of… bringing the planes back.

    Monday, July 6, 2009 at 2:06 pm | Permalink
  12. Geek Heaven wrote:

    Amen brotha. I’m tired of Google telling me how I *should* run my site.

    Wednesday, July 8, 2009 at 10:57 am | Permalink
  13. Martin wrote:

    I agree with you 100%, though I won’t stop using Google Tools :-)

    It’s incredible how Google changes the rules of the Internet for its benefit and almost no one questions it. I was having this same conversation with a friend the other day, thinking how many innovations we are missing just because we are “forced” to comply to Google’s guidelines.

    Thursday, July 9, 2009 at 4:24 am | Permalink
  14. Ian M wrote:

    One particularly good tool for blocking tracking by Google Analytics et al when visiting other sites is to use an ad-blocker plugin with an anti-tracking block list.

    The “EasyPrivacy” block list for AdBlock Plus (for Firefox) is pretty good – all the lists are here

    Wednesday, August 5, 2009 at 4:07 am | Permalink