Why The Web: The Answers

During our winding and richly interconnected journey, I have endeavoured to bring forward many of the reasons for the success of the World Wide Web in their broad context and with reference to selected comparators. It has been in many ways a personal journey, but it also applies the perspectives of convergent technologies and of envisioning the future. Before concluding with a brief overview of ongoing directions for our research, it is appropriate to bring together what we have found to be six key reasons for the success of the Web [26] and to comment briefly on what those findings might say about our general understanding of technology and society.

First, the Web was not built from the ground up, but as a modest extra layer upon a vast collection of pre-established technologies. Most of those software and underlying hardware technologies were already part of the Internet, particularly TCP/IP, FTP, MIME and Gopher. SGML was a strong standard for electronic publishing, a vast quantity of potential Web content was already Net-accessible or otherwise in electronically publishable form, and a rich array of software development tools was available for the development. Many of these factors either did not apply to our comparators or were not considered by them. I use a concept of ‘memetic distance’ [27] to help recognise that the step from one viable technology to the next must be manageably small.

Second, the Web was built by more efficient methods than those used in planned commercial developments. CERN’s work was justified as serving internal purposes, and NCSA’s as a research prototype. The Internet Engineering Task Force’s method for promulgating and adopting proposed standards is the antithesis of the bureaucracy that afflicts other standard-setting processes. In the Internet community and in big science research establishments such as CERN and NCSA, open collaboration is the normal modus operandi, in contrast to the proprietary considerations of commercial software developers such as Xanadu and PICA. However, those open methods rarely produce commercial-quality software, so plenty of space has now been opened up for the likes of Mosaic Communications.

Third, while each of Tim Berners-Lee’s three central standards plays an essential part in the success of the Web design, the difference between URLs and Xanadu’s ‘tumblers’ stands out. Even if cryptic, a URL provides a human-readable and printable way of addressing any information resource on the Internet. The first part of the URL identifies the scheme by which the target information is to be accessed, the second part is the name of the host computer in exactly the same form as is used in e-mail addresses, and the final part is the path to the information in that computer’s file system. A link to anywhere in the Web can thus be advertised in print or in a posting to a newsgroup. For efficiency, Xanadu’s tumblers were designed to be readable only by computer and so would not be sensible in the print media.
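The three-part anatomy described above can be made concrete with a brief sketch using Python’s standard urllib.parse module. The example URL is illustrative (CERN’s early project page is commonly cited in this form, but the address is used here only as an example):

```python
from urllib.parse import urlparse

# Split a URL into the three parts described above:
# scheme (how to access it), host (as in e-mail addresses), and path.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)

print(parts.scheme)  # the access scheme, e.g. "http"
print(parts.netloc)  # the host computer's name, e.g. "info.cern.ch"
print(parts.path)    # the path within that host's file system
```

Because every part is plain printable text, the whole address can be typed from a magazine page or a newsgroup posting, which is precisely the property the tumbler lacked.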

Fourth, the climate for the Web has been established by a succession of visions, particularly those of Ted Nelson and Xerox PARC, and it continues to be inspired by William Gibson’s once entirely fictional notion of cyberspace. Over a generation, those visions have fostered an intellectual and technical climate which eventually made bridgeable the memetic distance from the likes of hypertext, WIMP and Gopher to the full richness of the Web and Mosaic. Those who have been touched by the visions were also more than ready to put the Web to work for their own purposes.

Fifth, Marc Andreessen actually built a user-friendly graphical interface to the Web, and then he built another one for the rest of us. While Nelson noted that leaving the user interface for somebody else to develop was exactly what Xanadu had specified all along, his problem is that nobody ever did it for Xanadu. Many people other than Andreessen have also found building a Web interface, or even a Web server, to be an achievable task, particularly now that a market for such software has been established.

Sixth, countless ‘newbies’ rapidly found their own uses for the Web. It continued the downward spiral in publishing costs, both tangible and intangible, from those of the traditional media, through desktop publishing, and even lower through the effective elimination of the cost of targeted distribution. The Web quickly attracted a critical mass of self-publishers unconcerned about royalties and generally unconcerned about copyright, save for a few disclaimers disallowing others (particularly the imminent Microsoft network) from appropriating individuals’ work without appropriate recompense. The uses to which the Web is being put appear almost without limit.

The World Wide Web links a vast network of Latourian actors, human, non-human, material and ethereal. The six above-listed causes of the Web’s success dance with those actors across a profusion of interconnections. The ideas of human visionaries become memes propagating an epidemic of Web ‘surfing’. The Web’s computer codes become epidemic across the Internet. Loops in the Web’s links, and in its actor-network, feed back positively and cybernetically—fuelling its continued near exponential growth and its ever-accelerating transformation into cyberspace proper.

I am reminded of my retreat from the stillborn PICA System to the pre-Web Internet. Just about the first thing of interest I found on the Net was the Principia Cybernetica Project—Francis Heylighen et al.’s attempt to develop a comprehensive exposition of the philosophy of cybernetics, systems theory and complexity. They were looking for a way of presenting the rich interconnections of their work on the Net. They found the Web. Early. Through PCP [28], I found that my theoretical interest in complex systems, general evolution and memetics; my department’s interest in Latourian actor-networks; and my own practical interest in the emergence of cyberspace are all very amenable to study in terms of (second-order) cybernetics.

Cyberculture follows the postmodern.