Three Generations of SEO (Part II)

This is part II in a series of posts on SEO-driven businesses. Part I is here.

I see three generations of SEO in the wild, each building on the one before. Below are their distinguishing characteristics.

  • SEO 1.0: human-driven SEO. This is primarily about people tweaking content and linking structure. The main distinguishing characteristic of an SEO 1.0 company is that there are people on staff who spend a portion of their time optimizing content for SEO. The best example is probably About.com, with its 750+ experts and dozens of editors. The experts and editors choose which topics to write on and know how to write to drive rank. They know how to format titles, what words to avoid, what the “right” length of an article is, etc. They also know how to build a strong internal link structure, albeit mostly by hand. Our own TechTarget and THCN are pretty expert at SEO 1.0. For Internet marketing, HubSpot is a great resource that helps many companies get on the SEO 1.0 train. The biggest problem of SEO 1.0 is scale: it takes humans to optimize content.
  • SEO 1.5: adding UGC for scale. What you may not be able to do with a dozen or a few hundred experts you may be able to do with lots of contributors. Blogging sites are the main examples. The key here is to get to scale and avoid “poisonous” SEO: the type of content that can actually drive your rank down.
  • SEO 2.0: machine-driven SEO. If SEO 1.0 is about the human touch, SEO 2.0 is about software crunching through content databases. What SEO 2.0 lacks in human skill it gains in scale. The key is having good meta-data. WordPress.com is a good example. Although the content is UGC, nobody is teaching WP bloggers how to write for SEO. However, the smart folks at Automattic are taking the meta-data from blog posts, including but not limited to titles, tags, linking and visit histories, and using it to generate both new pages (for example, using the SEO tag) and new links in content (as in the auto-generated possibly related posts at the bottom of blog posts); a toy sketch of this kind of meta-data-driven page and link generation follows this list. E-commerce sites are great candidates for SEO 2.0 because products carry a lot of meta-data, allowing new pages to be built by brand, price and other features. Many also do SEO 1.0 through on-staff experts/marketers writing about the products and SEO 1.5 by adding (or syndicating) reviews. Another good example is EveryZing, run by my friend Tom Wilde, which can add meta-data to media assets (podcasts, videos) and make them search-friendly by doing speech-to-text. BitPipe (now owned by TechTarget) and Scribd are yet more examples.
  • SEO 2.5: content enhancement. I’m noticing an increasing effort, particularly with UGC, to use software tools to enhance the quality of the content that is produced, as opposed to trying to add value around it (such as through additional reading links, etc.). The idea is simple: the better the content and the more relevant links in it, the more interesting it is for the search engines and potentially for visitors. One player in this space is Zemanta (which Fred Wilson recently invested in), whose widget I’m staring at now as I write this post. I like what these guys are doing. At Plinky, we are taking a very different approach with the same goal: help people create better UGC.
  • SEO 3.0: it’s all about back links. If you have a lot of content with good meta-data you can create an amazing internal linking structure within the sites you control, but you need back links to drive rank. For years, domain squatters have built link farms across the domains they own and have tried to hide the coordinated nature of their linking from search engines. Then came the comment and review spammers, who essentially poisoned much of that pool of UGC for search. On the legitimate side, businesses have struck content partnerships that increase back links for years. But it wasn’t until the last couple of years that many companies approached the problem of building legitimate, value-added back links with scale and automation in mind. We first saw this through site templates (MySpace or WordPress, for example) and widgets, with the back links they carry to their creators. I am now seeing a next generation of startups innovating in this area. The delicate balance is to add tons of back links that add value to visitors as opposed to looking like a link farm.
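
To make the meta-data point concrete, below is a toy sketch of SEO 2.0-style page and link generation: given posts that carry nothing more than tag meta-data, software can mint tag landing pages and “possibly related” internal links without any human editorial effort. The data structures and scoring are hypothetical and purely illustrative; they are not how Automattic or any e-commerce platform actually does this.

    # Hypothetical illustration: tag meta-data alone drives both new pages and new internal links.
    from collections import defaultdict
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        slug: str
        title: str
        tags: set = field(default_factory=set)

    def build_tag_pages(posts):
        """Group posts by tag so each tag gets its own landing page (e.g. /tag/seo/)."""
        pages = defaultdict(list)
        for post in posts:
            for tag in post.tags:
                pages[tag].append(post)
        return pages

    def related_posts(post, posts, limit=3):
        """Rank other posts by tag overlap -- the auto-generated "possibly related" links."""
        scored = sorted(
            ((len(post.tags & other.tags), other) for other in posts if other is not post),
            key=lambda pair: pair[0],
            reverse=True,
        )
        return [other for score, other in scored[:limit] if score > 0]

    posts = [
        Post("seo-generations", "Three Generations of SEO", {"seo", "startups"}),
        Post("free-traffic", "The Economics of Free Traffic", {"seo", "economics"}),
        Post("better-ugc", "Helping People Create Better UGC", {"ugc", "startups"}),
    ]
    for tag, tagged in build_tag_pages(posts).items():
        print(f"/tag/{tag}/ links to {[p.slug for p in tagged]}")
    print("related to seo-generations:", [p.slug for p in related_posts(posts[0], posts)])

The same trick applies to richer meta-data (brand, price, speaker, location), and the generated pages become new targets for the internal linking structure.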

The majority of SEO experts and consultants built their fame doing SEO 1.0. Some have experience with SEO 1.5 and can suggest manual widget distribution strategies to help with SEO 3.0. These people tend to be focused much more on content than on software. In my own experience, and in talking to heads of marketing and CTOs at Polaris portfolio companies, I get the sense that very few SEO gurus have any experience with building the foundational software systems that drive SEO 2.0-3.0. This is a big problem. It is not easy to build an SEO-optimized content management + publishing system driven by a large content database with largely automated editorial processes. If you want to take your business to scale, you have to think about your SEO strategy and architect your service for it from the ground up. This is not something you can bolt on. Also, unless you already have an SEO-knowledgeable architect on your team, expect to spend a long time finding one, and consider building an SEO advisory board to augment the knowledge and experience of the person you’ll end up with.

The next part in the series will look at the economics of “free” traffic.


7 Responses to Three Generations of SEO (Part II)

  1. Robby Monk says:

    GREAT article. I’m trying to learn more and more about SEO and this definitely helps. I’m looking forward to the next article!
    http://www.robbymonk.com

  2. Good read – a much-needed injection of logic and structure into what we all knew (or, rather, felt). Better classification than other attempts I’ve seen. Although the ‘halves’ are not as ‘high contrast’ – incremental evolution is about fuzzy overlaps – the main 3 generations are distinct models, albeit not mutually exclusive. I advise clients to practice a balanced mix of all three, to the extent of their capability/maturity.

    As a business strategist more than technologist nowadays, I would like to see similar logic around the ‘Input’ socket of the SEO process: why we optimise, what traffic we want to drive (to where) and what we intend to do with such traffic? It is obvious that different intent (mass e-commerce v/s opinion-influencing socio-political blog, v/s niche b2b offering etc) would call for different approaches, models and generations of SEO (or their priority in the mix). IMHO today’s understanding of the reasons for SEO is rather simplistic and narrow, even among successful practitioners of all 3 generations.

    Do the SEO ‘gurus’ (or their automated Frankenstein optimisers) ever ask the business leaders any of the above questions, or assume an ‘obvious’ answer like ‘..because you want to sell more, don’t you?..’

    Just some more FFT but keenly looking forward to Part III, this is a seminal piece – thanks!

  3. Vlad, yes, the input question is key and tied to what the ROI on SEO is, especially at the margin. That’s what the third part is going to be about. No such thing as free traffic…

  4. Jon Zerden says:

    Sim,

    As you can imagine, I am quite interested in SEO 2.5/3.0… that is, the automated metadata/link creation.

    You mentioned that this can’t / shouldn’t be a bolt-on to an existing publishing system. I am not sure I totally follow… On an enterprise system, couldn’t I (relatively) simply add a step to the publishing process that sent (via web service or similar) my content and structured metadata (I’d still want to manage that internally) to a third party provider – then the provider could mark it up with links and provide me back some nicely formatted XML that could be parsed and added back into the CMS before moving to the next step in the process?

    In larger organizations (non-UGC), I am thinking that I would still need a way to sanitize the returned links/metadata, as this automated process would still be susceptible to the age-old problem of out-of-context, inappropriate and potentially damaging content/links.

    While the above process would definitely help me with SEO, it wouldn’t really help me with on-site search and/or content reporting. I’d still need a way to manage the structured portion of the metadata (which we use for on-site search, automated outbound feeds and for reporting on which types of content are popular), which would still require manual intervention and is prone to error…

    Jon

  5. Jon, in theory, anything is possible in software, right? My comment about bolt-ons has more to do with their practicality as opposed to the theoretical possibility of them adding value to an existing system. If you have legacy code, you should certainly look for ways to take advantage of higher levels of SEO. You should expect some pain, though. I know of no off-the-shelf Web CMS that can do (or even has the hooks for) SEO 2.0-3.0. The e-commerce platforms are the closest (used to working with a lot of meta-data) but the actual SEOability of pages is left as a manual “to do” for implementers.

    Yes, you can bolt on all kinds of enhancements to existing systems and, in the case of outsourced/hosted services as your example above suggests, you can have these be fairly sophisticated. You can undoubtedly get some value this way but you’ll create friction at the edges. Your comment already identified one potential issue: added workflow to validate externally-provided content enhancements. Also, you correctly point out that the added meta-data won’t necessarily help you with on-site search unless you make the processed/enhanced content what your search engine indexes. Last but not least, an outsourced provider may have a hard time helping you with back links in a legitimate and truly value-added way as opposed to something that search engines may consider an attempt to beat the system. A rough sketch of the round trip you describe, with a validation step added, follows below.
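
    A minimal sketch of what such a bolt-on publishing step might look like, assuming a hypothetical enrichment endpoint and a simple domain whitelist for the sanitization step you mention (the URL, payload fields and policy here are made up for illustration, not any real provider’s API):

        # Hypothetical bolt-on step: push content plus structured meta-data to an
        # external enrichment service, then validate the returned links before they
        # re-enter the CMS. Endpoint, payload fields and whitelist are illustrative only.
        import json
        from urllib.parse import urlparse
        from urllib.request import Request, urlopen

        ENRICH_URL = "https://enrichment-provider.example.com/v1/enrich"  # hypothetical endpoint
        ALLOWED_DOMAINS = {"example.com", "partner.example.org"}          # editorial link policy

        def enrich(content, metadata):
            """Send the article body and its meta-data out; get back suggested links/markup."""
            payload = json.dumps({"content": content, "metadata": metadata}).encode("utf-8")
            request = Request(ENRICH_URL, data=payload,
                              headers={"Content-Type": "application/json"})
            with urlopen(request) as response:
                return json.load(response)

        def sanitize_links(suggestions):
            """Drop off-policy links -- the out-of-context/inappropriate-link problem above."""
            return [s for s in suggestions
                    if urlparse(s.get("url", "")).netloc in ALLOWED_DOMAINS]

        def publishing_step(article):
            """One extra step in the publishing workflow before the content moves on."""
            result = enrich(article["body"], article["metadata"])
            article["links"] = sanitize_links(result.get("links", []))
            return article

    The friction shows up exactly at these seams: the validation/whitelist step and the question of which version of the content your own search engine ends up indexing.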

  6. Tony Confrey says:

    Thanks for the article. SEO is something I definitely need to know more about.

    I’m thinking there’s maybe another dimension that lines up with your generations. That’s the evolution from Web 1.0 to 2.0 to (maybe) 3.0.

    With Web 1.0, content is static and indexing is pretty simple. With the RIAs of Web 2.0 it seems to me that the whole problem gets much more difficult, both for the search engine and the optimizer. In my own case, for example, most of the action takes place inside a Flash player on the page, with the data hidden away. This makes machine annotation of the surrounding HTML much more important. Have you any thoughts on this?

    As to the Semantic Web, maybe again there’s a parallel to SEO 3.0. When everything is RDF, the linking *is* the data!

    Tony

    PS As an aside (or maybe a separate point), there’s also the advent of widget proliferation. With my content in an embedded widget pointing back to my site, do I get a leg up on back link generation?

  7. Kostikes, I’m Bulgarian and my Russian is rusty.

    I’m not sure what the problem with my RSS feed link is; it seems like a character set problem.

    As for your second comment, my theme is mentioned at the bottom of every page: it’s Regulus by Binary Moon.
