The Early Glory Days
In the old scheme of things, being a well-known brand alone took care of your reach and appeal, and so the information creator was king.
· The only info broker was the ISP, who connected you to the information.
· People knew the portals and sites and visited them; end of story.
We could call it guilt by association, but it worked.
Slave To Evolution
From its early days, when it was spread over just a few pages and websites, the web has grown enormously.
· Be it weblogs, news sites or online communities, there is too much information out there clamoring for the surfer's attention.
· Finding the information and making it relevant is where the money is. Enter Google News, Topix.Net, Findory etc.
· The future belongs not to the information creator, but to the one who organises, contextualises and delivers the information through what we can call information gateways.
People who control these information gateways (Google News, Yahoo News, Topix.Net) hold the key to our relevance on the web.
They do it in several ways. Consider an example: I want to read about the Bhopal gas tragedy, so I search first on Google.
Case a: People come to XYZ Newsite from the Google results page if we are ranked well.
Case b: People come to XYZ Newsite from the news search result on the Google search results page.
Case c: People come to XYZ Newsite from the ads on the news search results page.
Case d: A popular weblog or aggregation point (Metafilter, Slashdot etc.) links to one such story from XYZ Newsite; more people, even those who do not know about us, visit us.
Days of guilt by association are long gone; guilt by omnipresence is now the in thing.
Missing Evolution's Cluetrain
If we cannot be present in all of the cases stated above, the following could happen:
· Lose out on being discovered by audiences other than our traditional ones.
· Risk becoming irrelevant because we are absent at aggregation points.
· Lose out on being contextual to other, unrelated content.
As a result of this evolution, the web's audience has split into two major groups, who come looking for:
1) Generic content (commonly available, everyone-has-it, primarily news-oriented content) that will be driven by the aggregation portals.
2) Specialised content (content associated with a particular organization for its singular nature, e.g. NYT or Time Magazine features) that won't be found elsewhere, driving audiences directly to the websites.
Syndication Soup For The Soul
RSS and other XML formats are increasingly becoming the chosen mode of data markup for syndication, content reuse and content repurposing.
Alarmed by the ways in which data thus provided was being transformed, the Digital Web Magazine recently asked, "Should we be concerned that aggregators are increasingly allowing users to find their own ways to use our content how they see fit?"
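To see how little effort that repurposing takes, here is a minimal sketch of an aggregator consuming a feed. The feed contents, the site name and the `extract_items` helper are all hypothetical, for illustration only; any RSS 2.0 feed would parse the same way.

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal RSS 2.0 feed such as a news site might publish.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>XYZ Newsite</title>
    <link>http://example.com/</link>
    <description>Example feed</description>
    <item>
      <title>Bhopal gas tragedy: twenty years on</title>
      <link>http://example.com/bhopal</link>
    </item>
    <item>
      <title>Markets rally on year-end trade</title>
      <link>http://example.com/markets</link>
    </item>
  </channel>
</rss>"""

def extract_items(feed_xml):
    """Pull (title, link) pairs out of an RSS 2.0 feed string."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

# An aggregator can now re-sort, filter, or remix these headlines
# entirely outside the publisher's own pages.
for title, link in extract_items(FEED):
    print(title, "->", link)
```

Once content is marked up this way, the publisher no longer controls the page it is read on; that is precisely the trade-off the question above is about.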
In a strange way, it is both a threat and an opportunity at the same time.
· With more people moving towards information aggregators and other means of accessing data, page views are bound to drop sooner or later.
· This would make the reliance on advertisements from these pages a dicey proposition for the future.
· This opportunity should be used to explore other means of advertising, or to occupy the spaces where advertisers are headed.
· Content creation and distribution should be reorganized to reflect the dichotomy mentioned earlier (less, but systematic and streamlined, effort should go into generic content, while more effort should go into the specialised content).
We cannot know for sure which direction things will head in the future.
Keeping that in mind, it would be stupid either not to move at all or to put all our eggs in just one basket.
At the very least, any online entity should maintain a minimum presence in all the information gateways if it is to stand any chance in the future.