The postmodern web: Mark Liberman blogs on a critique of the Semantic Web by Malcolm Hyman and Jürgen Renn at the NSF/JISC Repositories Workshop. Mark makes several good and entertaining points, but he left one for me to make. He quotes the following from Hyman and Renn's text:
Web 2.0 is the protestant vision of the Semantic Web: where central authorities have failed in mediating between the real world of data and the transcendental world of meaning, smaller, self-organized groups feel that they are called upon to open direct access to this transcendental world in terms of their own interpretations of the Great Hypertext. The traditional separation between providers/priests and clients/laymen is thus modified in favor of a new social network in which meaning is actually created bottom up. The unrealistic idea of taxonomies inaugurated by top-down measures is being replaced by the more feasible enterprise of "folksonomies" spread by special interest groups. As their scope remains, however, rather limited and the separation between data and metadata essentially unchallenged, the chances for developing such a social network into a knowledge network fulling [sic] coping with the real world of data are slim.
He then quotes several bullets from their talk outline, including the following:
Moving from servers and browsers to interagents that allow people to interact with information
Replacing browsing and searching with projecting and federating information
Enabling automated federation through an extensible service architecture
Extending current hypertext architecture with granular addressing and enriched links
These goals contradict Hyman and Renn's bottom-up fantasy. Extensible service architectures, federated information, and fine-grained links have all pretty much failed because they simply import into software design the challenge of getting multiple parties with differing interests and no central coordination to agree on a common vocabulary and its computational bindings. Successful large-scale open-source projects require a BDFL (benevolent dictator for life) like Linus Torvalds or Guido van Rossum who is respected and accepted by the great majority as a gatekeeper for requirements, design, implementation standards, and testing.
The problem is that we have no idea how to reproduce in the computational realm the socially and cognitively self-organizing evolution of human language. Linguistic semantics is grounded in human perception, action, and social interaction. Current computational methods are too brittle, so they require centralized design if they are to work at all. Coarse-grained modularity (operating system, applications, ...) allows some degree of distributed development, but agreement on interfaces remains a major bottleneck. For these reasons, shallow, robust methods operating on mostly flat text (search as we know it) are more effective than supposedly more discriminating methods operating on allegedly richer representations. This may change as we develop more robust methods for data integration, but Hyman and Renn's "protestant vision of the Semantic Web" is at present just a cute analogy without substance.