I see three generations of SEO in the wild, each building on the last. Below are their distinguishing characteristics.
- SEO 1.0: human-driven SEO. This is primarily about people tweaking content and linking structure. The main distinguishing characteristic of an SEO 1.0 company is that there are people on staff who spend a portion of their time optimizing content for SEO. The best example is probably About.com, with its 750+ experts and dozens of editors. The experts and editors choose which topics to write on and know how to write to drive rank. They know how to format titles, which words to avoid, the "right" length for an article, etc. They also know how to build a strong internal link structure, albeit mostly by hand. Our own TechTarget and THCN are pretty expert at SEO 1.0. For Internet marketing, HubSpot is a great resource that helps many companies get on the SEO 1.0 train. The biggest problem of SEO 1.0 is scale: it takes humans to optimize content.
- SEO 1.5: adding UGC for scale. What you may not be able to do with a dozen or even a few hundred experts, you may be able to do with lots of contributors. Blogging sites are the main examples. The key here is to get to scale while avoiding "poisonous" SEO: the type of content that can actually drive your rank down.
- SEO 2.0: machine-driven SEO. If SEO 1.0 is about the human touch, SEO 2.0 is about software crunching through content databases. What SEO 2.0 lacks in human skill it gains in scale. The key is having good meta-data. WordPress.com is a good example. Although the content is UGC, nobody is teaching WP bloggers how to write for SEO. However, the smart folks at Automattic are taking the meta-data from blog posts, including but not limited to titles, tags, linking and visit histories, and using them to generate both new pages (for example, using the SEO tag) and new links in content (as in the auto-generated possibly related posts at the bottom of blog posts). E-commerce sites are great candidates for SEO 2.0 because products carry a lot of meta-data allowing new pages to be built by brand, price and other features. Many also do SEO 1.0 through on-staff experts/marketers writing about the products and SEO 1.5 by adding (or syndicating) reviews. Another good example is EveryZing, run by my friend Tom Wilde, which can add meta-data to media assets (podcasts, videos) and make them search-friendly by doing speech-to-text. Yet more examples are what BitPipe (now owned by TechTarget) or Scribd are doing.
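The WordPress.com pattern described above, mining the meta-data a platform already has, can be sketched in a few lines: group posts by tag to auto-generate tag archive pages, and rank "possibly related posts" by tag overlap. This is a toy illustration of the idea, not Automattic's actual system; the post corpus and function names are hypothetical.

```python
from collections import defaultdict

# Hypothetical post corpus: each post has a title and tags, the kind
# of meta-data a blogging platform already collects for every post.
posts = {
    "why-seo-matters": {"title": "Why SEO Matters", "tags": {"seo", "marketing"}},
    "seo-2-0-explained": {"title": "SEO 2.0 Explained", "tags": {"seo", "automation"}},
    "cooking-at-home": {"title": "Cooking at Home", "tags": {"food"}},
}

def tag_pages(posts):
    """Group posts by tag, yielding one auto-generated archive page per tag."""
    pages = defaultdict(list)
    for slug, post in posts.items():
        for tag in post["tags"]:
            pages[tag].append(slug)
    return dict(pages)

def related_posts(slug, posts):
    """Rank other posts by how many tags they share with this one."""
    tags = posts[slug]["tags"]
    scored = [
        (len(tags & other["tags"]), other_slug)
        for other_slug, other in posts.items()
        if other_slug != slug
    ]
    return [s for score, s in sorted(scored, reverse=True) if score > 0]

print(tag_pages(posts)["seo"])                    # both SEO posts land on the "seo" tag page
print(related_posts("why-seo-matters", posts))    # suggests the other SEO post, not the cooking one
```

Real systems add signals like visit histories and link graphs on top of this, but the core move is the same: every piece of meta-data becomes a new page or a new internal link, with no human editor in the loop.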
- SEO 2.5: content enhancement. I'm noticing an increasing effort, particularly with UGC, to use software tools to enhance the quality of the content that is produced, as opposed to trying to add value around it (such as through additional reading links, etc.). The idea is simple: the better the content and the more relevant links in it, the more interesting it is for the search engines and potentially for visitors. One player in this space is Zemanta (which Fred Wilson recently invested in), whose widget I'm staring at now as I write this post. I like what these guys are doing. At Plinky, we are taking a very different approach with the same goal: helping people create better UGC.
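Tools of this kind work roughly by scanning a draft for terms that match known link targets and suggesting relevant links inline as the author writes. A toy version of that matching step, assuming a small hand-built dictionary of link targets (I'm not describing Zemanta's actual implementation):

```python
import re

# Hypothetical dictionary mapping terms to canonical link targets.
LINK_TARGETS = {
    "WordPress": "https://wordpress.org/",
    "SEO": "https://en.wikipedia.org/wiki/Search_engine_optimization",
}

def suggest_links(draft):
    """Return (term, url) pairs for known terms found in the draft text."""
    suggestions = []
    for term, url in LINK_TARGETS.items():
        # Word-boundary match so "SEO" doesn't fire inside e.g. "MUSEOLOGY".
        if re.search(rf"\b{re.escape(term)}\b", draft):
            suggestions.append((term, url))
    return suggestions

draft = "I set up WordPress last week and started learning SEO."
print(suggest_links(draft))
```

A production tool would rank candidates and disambiguate terms rather than do literal matching, but the enhancement happens at write time, inside the content itself, which is what separates SEO 2.5 from the page-generation approach of SEO 2.0.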
- SEO 3.0: it’s all about back links. If you have a lot of content with good meta-data, you can create an amazing internal linking structure within the sites you control. But you need back links to drive rank. For years, domain squatters have built link farms across the domains they own and have tried to hide the coordinated nature of their linking from search engines. Then came the comment and review spammers, who essentially poisoned much of that pool of UGC for search. On the legitimate side, businesses have struck content partnerships that increase back links for years. But it wasn’t until the last couple of years that many companies approached the problem of building legitimate, value-added back links with scale and automation in mind. We first saw this through site templates (MySpace or WordPress, for example) and through widgets, both of which carry back links to their creators. I am now seeing a next generation of startups innovating in this area. The delicate balance is to add tons of back links that add value to visitors without looking like a link farm.
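The widget pattern is worth making concrete: every embed a blogger pastes into their template renders something useful to the visitor and carries one descriptive back link to the creator, which is the legitimate, scalable version of what link farms fake. A minimal sketch of generating such a snippet (the host, widget ID, and copy are all made up for illustration):

```python
def widget_embed(widget_id, host="https://widgets.example.com"):
    """Build an embed snippet a blogger can paste into their site.

    The snippet loads the widget itself (value for the visitor) and
    includes exactly one descriptive back link to the widget's creator.
    """
    return (
        f'<script src="{host}/w/{widget_id}.js"></script>\n'
        f'<a href="{host}/about?ref={widget_id}">Stats by Example Widgets</a>'
    )

print(widget_embed("abc123"))
```

Each embed installed on a third-party site is one more back link, earned automatically and at scale, yet tied to something a visitor actually uses.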
The majority of SEO experts and consultants built their fame doing SEO 1.0. Some have experience with SEO 1.5 and can suggest manual widget distribution strategies to help with SEO 3.0. These people tend to be focused much more on content than on software. In my own experience, and in talking to heads of marketing and CTOs at Polaris portfolio companies, I get the sense that very few SEO gurus have any experience building the foundational software systems that drive SEO 2.0-3.0. This is a big problem. It is not easy to build an SEO-optimized content management + publishing system driven by a large content database with largely automated editorial processes. If you want to take your business to scale, you have to think about your SEO strategy and architect your service for it from the ground up. This is not something you can bolt on. Also, unless you already have an SEO-knowledgeable architect on your team, expect to spend a long time finding one, and consider building an SEO advisory board to augment the knowledge and experience of the person you’ll end up with.
The next part in the series will look at the economics of “free” traffic.