Looking at the history
In order to look forward, it always pays to look back. The dominant standards for the web today are undeniably HTML (and variants like XHTML) and HTTP. More recently, XML has emerged and, increasingly, RSS is becoming the dominant type of XML for sharing a variety of data.
How did each of those standards become a standard? It is obvious now (hindsight is always 20/20) that standards bodies have relatively little bearing when it comes to influencing the success of a format. Take, for example, SGML, which was the dominant standardized format for document formatting. It was quickly superseded by HTML which, at the time, was not considered a standard.
The same is true of RSS and other standards for syndication. Formats like ICE, CDF, and NewsML were touted as the future when they were first introduced. However, they’ve recently been superseded by RSS.
And even within the RSS world, formats like RSS 1.0, which was supposed to be more semantically sound, and Atom, which was supposed to be more forward-thinking than RSS 2.0, have been losing the war to RSS 2.0.
Bootstrapping is a social phenomenon
What Dave Winer understood, when he shepherded RSS 2.0 into becoming the dominant means of delivering syndicated content, is that the life and death of a new format is predicated on its widespread adoption. And, in order to increase adoption, one has to make something generic, easy to understand, and simple.
Many of the people in the early days of the syndication space failed to see it as Dave did. We believed that a semantically sound format was better, and we were wrong. Purity, it turns out, is not always a good thing, especially if it gets in the way of people implementing something.
The same is true of HTML. I’d venture that, from a development standpoint, the biggest boost to HTML was a single menu feature that appeared in early browsers and remains there to this day: view source. In the early days of the web, countless developers learned how to do cool things with HTML by reading the source of pages designed by other people.
In a recent issue of Ambidextrous magazine, Jeffrey Schox talks about the three stages of technological development: appropriation, early innovation, and sustainable innovation. Here’s how he describes the appropriation stage:
an issued patent allows innovators to construct roadblocks behind them as they travel down a particular technological path… During the appropriation stage, patent roadblocks waste time and money… The countries, needing to catch up with the designs and technologies of other countries, should focus on collecting revenue and knowledge streams to fuel later stages of technology development.
While he focuses on hardware and electronics in a globalized marketplace, the same truth can be applied to standards. With few barriers to adopting a new standard and by fostering a culture of appropriation, one can easily establish a base of people who understand a new format. As more people understand it, they start implementing it and, as they get smarter about it, start building on the efforts of previous creators. Eventually, those masses of tinkerers reach a critical point, pushing the new format into areas that were unexpected. Some companies then catch on, see growth in that area, and start experimenting with the new format themselves.
Eventually, by general agreement among developers, the format becomes a de facto standard. It does not need the imprimatur of a standards body (except for some very late adopters, or pockets where such an imprint is considered important) and moves forward.
What is interesting is the next stage, the one where standards bodies see the area as hot and decide that they need to play in that field. A good example is the Atom format, which has been enshrined as an IETF-approved standard and, to date, has failed to stop the RSS 2.0 juggernaut.
So what happened?
The amazing thing is how simple the issue is. The reason RSS 2.0 has been winning is that it has developed a following. With every new developer learning RSS 2.0, the format grows stronger, and the same is true of every company implementing it. Because it is simple, it’s easy to pick up, which means that new developers can do interesting things with it relatively quickly, giving them a chance to become active members of the community and, in turn, become hooked on it.
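To illustrate that simplicity, here is a minimal sketch of the kind of feed a new developer has to deal with. The feed content below is made up for illustration; the point is that a channel, a few child elements, and a list of items can be handled with nothing more than a stock XML library:

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal RSS 2.0 feed: a <channel> with a title, link,
# and description, plus one <item>. That is the whole learning curve.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.com/</link>
    <description>A sample feed</description>
    <item>
      <title>Hello, syndication</title>
      <link>http://example.com/hello</link>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(feed)
channel = root.find("channel")

# Print the channel title, then every item's title and link.
print(channel.findtext("title"))
for item in channel.findall("item"):
    print(item.findtext("title"), item.findtext("link"))
```

A developer who can read this can write a working feed reader in an afternoon, which is exactly the kind of quick win that turns tinkerers into advocates.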
The other issue is keeping things relatively open while still maintaining some level of control over the general direction. A successful future standard has to allow people a chance to contribute but, in the end, it also needs some gatekeepers who decide what goes in and what doesn’t. The same truth applies to any software development cycle: Linux, for example, may be a widespread open source phenomenon, but the number of people who decide what does or doesn’t go into the core kernel is still relatively small. The same is true of any successful open source project: some level of centralized decision making combined with distributed work, where anyone can contribute but not every contribution makes it into the final product.
I’m now seeing some of the same history repeat itself in the OPML space. It’s a format that is very simple, and Dave is working very hard on getting people to support it. It’s the same scenario he used to bootstrap the RSS format and to push concepts like blogging and podcasting into the mainstream. It’s a formula that works: keep it simple to implement, maintain some level of centralized control over the roadmap, and then evangelize it left and right until it can no longer be stopped.
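OPML asks even less of an implementor than RSS does. A rough sketch, using a hypothetical subscription list (the outline names and URLs below are invented): a head with some metadata, and a body of nested outline elements, where the nesting itself is the entire data model.

```python
import xml.etree.ElementTree as ET

# A hypothetical OPML document used as a feed subscription list.
# Everything interesting lives in attributes on nested <outline> nodes.
opml = """<?xml version="1.0"?>
<opml version="2.0">
  <head><title>My subscriptions</title></head>
  <body>
    <outline text="Tech">
      <outline text="Example Weblog" type="rss"
               xmlUrl="http://example.com/rss.xml"/>
    </outline>
  </body>
</opml>"""

root = ET.fromstring(opml)
print(root.find("head/title").text)

# Walk every outline node at any depth and collect the feed URLs.
feeds = [node.get("xmlUrl")
         for node in root.iter("outline")
         if node.get("type") == "rss"]
print(feeds)
```

That a complete, useful consumer of the format fits in a dozen lines is precisely the property that makes the bootstrapping strategy described above possible.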