In search of the content holy grail June 14, 2007

Posted by Bernard Lunn in B2B Media.

I live in New York, run an outsourcing business based in India focused on B2B Media, and have spent most of my career bringing new software products to market. So I was intrigued to hear the CEO of Reed Business Information describe the two different content models. In New York you assemble a team with three editors and one programmer; in Silicon Valley your team is three programmers and one editor. As a service company we are agnostic: we can assemble a team of researchers to manually aggregate, structure and re-purpose existing content, or we can build software to do the same automatically. It all depends on the requirements.

Automated data aggregation is inexpensive and scalable, but usually not quite authoritative enough. Even the best vertical search engines and community-created content are usually only a starting point for research. Truly authoritative research (by which I mean research good enough to base decisions involving real money on) almost always involves human labor.

The holy grail is the top right of a magic quadrant with “automated” on one axis and “authoritative” on the other. I hope you have seen enough magic quadrants, as my drawing skills are weak. The bottom left – neither authoritative nor automated – is clearly a waste of time and money. Oops, sorry Blogosphere :-). Authoritative-but-manual is where most traditional publishing and market research lives, and despite what Silicon Valley might think it will survive in some form forever. Automated-but-not-authoritative will continue to get better and better, fueled by new investment and the inherent aggregation power of the Net, but most of it is still in the category of “interesting and sort of useful” – not quite authoritative enough for prime time.
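For readers who have not seen one, a rough text sketch of the quadrant (the example labels are my own shorthand for the categories above):

```
                       manual                 automated
                 +--------------------+----------------------+
  authoritative  | traditional        | THE HOLY GRAIL       |
                 | publishing,        | (rare; e.g. Reuters  |
                 | market research    | Forex pricing)       |
                 +--------------------+----------------------+
  not            | a waste of time    | vertical search,     |
  authoritative  | and money          | community content    |
                 +--------------------+----------------------+
```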

I tried to think of examples that sit in the magic top right quadrant, both automated and authoritative, and the best one I could come up with pre-dates the Internet; it is one I was involved with over 20 years ago. Reuters started with human editors around the world, and that is still an important part of their business, but it is their Foreign Exchange pricing service that stands out as a great example. When I first heard about it in the mid-1980s it was astounding to see a model where the data input was all free and yet people would pay a lot of money every month to get the aggregated data. It was not until eBay came along that I saw such a pure money-making machine based on the power of network economics.

The Internet makes aggregation much easier – so easy, in fact, that building barriers to entry becomes the issue. The automated/authoritative model involves everybody who matters in the market contributing their data, typically for free. The contributors are far more savvy now about the value they bring, and any start-up that tries to build too big a toll booth will miss the network effect that comes from aggregating everybody; others will then enter the market, it will fragment, and it will no longer be valuable.

To pull off another Reuters-like automated/authoritative play, three things have to come together before other start-ups can get in and ruin the game:

  1. The source contributors have to be authoritative. That worked in the case of Reuters as only Forex traders could contribute and by definition every transaction was valuable data that made the market.
  2. Enough contributors have to join to make it comprehensive and they have to contribute regularly.
  3. The output has to become the single data point that is trusted in the market. No other research is needed. If Reuters says that one pound is worth $1.96 then that is the market fact.

This can only work in very simple taxonomies with limited need for data depth; that is the key to being authoritative. A market price is a totally simple taxonomy. Even something as apparently simple as People Data (where there is a lot of automated aggregation going on) is far more complex, and the depth of data that is possible and needed is almost limitless, so no single service can become the authoritative source except in a very specific niche.

I see tons of niche opportunities that work on a mix of automation and manual research, where the niche is small enough to enable a company to become authoritative. These typically will fall below the radar screen of VC funded technology start-ups. I don’t see a large scale automated/authoritative opportunity. Of course if I did I would be building it and not blogging about it!
