Whatever happened to …?

In November 1987, the first Hypertext conference was held at the University of North Carolina at Chapel Hill, with Nelson and Engelbart each, for the first time in their lives, gaining recognition and acclaim from an audience that did not need to be persuaded of the worth of their work. Although there were whispers that a deal was in the wind, the conference came a little too early for the announcement, made in April 1988, that software giant Autodesk would commit millions to the development of Xanadu.

To make a long story short, an office was set up at 550 California Avenue, Palo Alto, with a development team headed by Roger Gregory, whom I first met at Hypertext ‘87 and later caught up with at a party hosted by Keith Henson17 for the Bay Area Science Fiction Club, and at the offices of the Xanadu Operating Company (XOC). But nothing really changed. Despite the wishes of many involved that a ‘Xanadu Light’ product with limited functionality be released as quickly as possible, the management decision was to perfect Xanadu’s rich hypertext model before any commitment would be made to a product roll-out. After spending almost $US9 million on the development, Autodesk finally called a halt in March 1992 as part of a corporation-wide retreat to core business. Last year Keith Henson was able to buy the rights to all of XOC’s work for $US5000, and he and Roger Gregory are now back in an office at 550 California Avenue trying to knock the code into a shape that will finally be useful for something. They are working in conjunction with a rapidly growing 1994 software startup called Filoli, which is developing information systems for U.S. insurance major Industrial Indemnity. As of early January 1995, Keith and Roger were not certain whether the Xanadu software would ever get to play its envisaged role in the Filoli project, but they were still trying to find and fix the last bugs. Despite having been amongst the early Internet users, they had not yet managed to experience the Web personally, although they have started to talk about a Xanadu server being compatible with Web client software.

Meanwhile, in a press release dated 23 September 1994, Ted Nelson informed his fans that he was off to Japan for a year:

Ted Nelson Studios and the Sapporo Electronic Center jointly announced … the forming of the Sapporo HyperLab, a new design center for electronic media for the Internet.

The HyperLab will implement joint projects of Theodor Holm Nelson, inventor of hypertext and founder of Project Xanadu, and Professor Yuzuru Tanaka, Professor of Electrical Engineering at the University of Hokkaido, to provide new software and media to be distributed free on the Internet.

In other parts of his electronically published “News Letter, Number Three” (where that press release also appears), Nelson described his post-Autodesk views, having regained control of the Xanadu name and keeping an interest in the continuing work of XOC. He had come to see Xanadu as a business model for electronic publishing copyright more than as a technological solution, but he still had a range of technical suggestions for further development towards the Xanadu ideal of “bidirectional transclusions”. Describing “Component Xanadu” and “Xanadu Light”, he showed that he had certainly taken on board some of the changed circumstances due to the success of the Web, while retaining a keen wish that the earlier work on Xanadu would still prove to have value over and above its inspiration to those who followed:

The previous designs have assumed that Xanadu would be a universal system for managing change and global linkage. But now we live in a much more anarchic world of many clever hackers. This means we have to think in terms of building Xanadu out of functioning parts.

THE DIFFERENT FUNCTIONS OF XANADU CAN BE ACCOMPLISHED BY THE DIFFERENT PIECES. But tying them together is a substantial protocol issue.

However, let’s not count out the major work done by the guys at XOC. It may yet work, and it’s still way at the state of the art. …

Xanadu Light is a very simple design. The hard part is the client program you need as a browser. For real transclusion management you need to be able to keep track of pieces and layers ten times more complicated than the code that Mosaic deals with. Perhaps Tumblers will get used after all.

As always, we expect that others will develop the client programs. (Indeed, that’s what happened with World Wide Web.) [his parentheses]
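Nelson’s passing mention of Tumblers refers to Xanadu’s hierarchical addressing scheme, described in his Literary Machines, in which zeros separate the fields of an address (node, user, document, span) and a region’s address is a prefix of the addresses of everything inside it. The following is only an illustrative sketch of that prefix idea, using a toy tuple representation; real tumblers also support transfinite arithmetic, which is omitted here.

```python
# Toy sketch of Xanadu-style tumbler addresses (illustrative, not XOC code).
# Zeros act as field separators, e.g. node.0.user.0.document.0.span.

def parse_tumbler(s):
    """'1.1.0.1.0.2' -> (1, 1, 0, 1, 0, 2)"""
    return tuple(int(d) for d in s.split("."))

def contains(region, addr):
    """An address lies within a region when the region's tumbler
    is a prefix of the address."""
    return addr[:len(region)] == region

doc = parse_tumbler("1.1.0.1.0.2")          # a document's address
span = parse_tumbler("1.1.0.1.0.2.0.3.5")   # a span inside that document
other = parse_tumbler("1.1.0.1.0.3")        # a different document

print(contains(doc, span))   # True: the span is inside the document
print(contains(doc, other))  # False: a sibling document
```

The prefix property is what would let a Xanadu server resolve a link to a span without consulting the document’s content, which is presumably why Nelson thought tumblers might still find a use in Web-scale addressing.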

While Xanadu was the technical focus of the global hypertext vision for two decades, Andries van Dam’s 1967 implementation of the first “Hypertext Editing System” at Brown University made Providence, Rhode Island, the preeminent centre for academic exploration of the potential of hypertext through that same long period of incubation.

Brown University has played a major role in the design and development of both hypertext systems and materials ever since Ted Nelson and Andries van Dam worked together here in the 1960s.


Those efforts became focussed on Brown’s Institute for Research in Information and Scholarship (IRIS) between 1983 and 1994 and particularly on the development and exploitation of IRIS’s Intermedia software between 1987 and 1992.

Intermedia, which employs object-oriented programming, uses a client-server model with a Unix-based system of permissions and user groups that allows linking to otherwise read-only documents. Multiple users can therefore read the same document while someone is editing it; users can simultaneously edit different parts (documents) of a web.18 … InterDraw documents take the form either of scanned images or of ones created using Intermedia’s structured graphics editor. Intermedia also includes InterVal, a timeline editor, and InterWord, a style-based text editor. One of Intermedia’s crucial features is the Web View, an advanced navigation tool that essentially removes most possibilities of disorientation, or getting lost in hyperspace.
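The architectural point here is that Intermedia kept its links in a separate “web” database rather than embedding them in documents, so anchoring a link required only read access to the endpoints. The sketch below is a minimal illustration of that design choice, with hypothetical class and user names of my own; it is not Intermedia’s actual code or API.

```python
# Illustrative sketch (hypothetical names, not IRIS code): links live in an
# external "web" database, so linking never modifies a document and needs
# only read permission on both endpoints.

class Document:
    def __init__(self, doc_id, owner, readers):
        self.doc_id = doc_id
        self.owner = owner           # only the owner may edit the content
        self.readers = set(readers)  # users/groups with read access

class Web:
    """External link database: links are stored apart from documents."""
    def __init__(self):
        self.links = []  # (source_doc_id, target_doc_id, author)

    def add_link(self, user, source, target):
        # Linking requires read access to both ends, not write access.
        for doc in (source, target):
            if user != doc.owner and user not in doc.readers:
                raise PermissionError(f"{user} cannot read {doc.doc_id}")
        self.links.append((source.doc_id, target.doc_id, user))

essay = Document("essay", owner="landow", readers={"students"})
notes = Document("notes", owner="students", readers={"landow", "students"})

web = Web()
web.add_link("students", notes, essay)  # students link to a read-only essay
print(web.links)  # → [('notes', 'essay', 'students')]
```

Because the document itself is never touched, many readers can follow or create links into a document while its owner edits it, which is exactly the multi-user behaviour described above.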


Initially, Intermedia was used to support coursework at Brown, firstly for a Literature course given by George Landow, followed by Biology and many other disciplines. By the time of my last visit there in 1989, the emphasis of IRIS had moved strongly towards licensing Intermedia to other users, particularly other universities, where they were getting some initial positive responses, although I formed the opinion that they were not nearly ready for the kind of commercial representation in Australia that was by then the focus of PICA Pty Ltd’s business.

Norman Meyrowitz and Nicole Yankelovich published numerous papers on aspects of the technical development which they led, while Landow published even more extensively on the scholarly implications of utilising Intermedia. In Hypertext, Landow expounds the convergence of the critical theory of Derrida and Barthes with information technology—providing a most comprehensive expansion of a point which I had made independently with respect to Lyotard’s Post-Modern in a 1992 paper.

Landow concludes his chapter “Reconfiguring the Author” (Hypertext, esp. pp 99-100) by recounting the involvement of a team of more than 12 people who contributed in some significant way to one particular Intermedia web. He has found hypermedia to require team production, far removed from the traditional concepts of authorship, and very much in line with current practices of CD-ROM-based multimedia production. That point raises a significant question for Nelson’s continuing preoccupation with Xanadu’s mechanisms for extending traditional copyright notions into the new media, and is entirely consistent with my colleague Ian Webster’s observation that the early content of the World Wide Web is overwhelmingly self-promoting, and thus not in need of any special mechanisms for intellectual property protection.

In response to points I recently raised with Landow by e-mail, he expressed some reservations about my plan to use Intermedia in this kind of comparison with the success of the Web:

I don’t think Intermedia failed at all or in any way: It was intended from the beginning as a research system that would demonstrate the validity of many of the ideas of Nelson, van Dam, Yankelovich, Meyrowitz, et al. And it did, and other systems and programmers have picked up many of the ideas there embodied.

However, in his 1994 “Afterword to the Catalan Translation” of Hypertext, which he generously sent me by accompanying e-mail, Landow puts a rather different point:

In 1990 Apple Computers, Inc. effectively put an end to the Intermedia project, part of which they had funded, by so altering A/UX, their version of UNIX, that all development of the program ceased, and when Apple made new models of the Macintosh fundamentally incompatible with the earlier version of A/UX, it became clear that no one could use Intermedia even in a research situation. Intermedia ran essentially without maintenance for two years—itself an astonishing tribute to software of such complexity—while we sought another system.

This leaves me in no doubt that at least the objectives of Brown’s commercial arm with respect to Intermedia failed. Although Intermedia was built on the kind of technical foundations (Unix, client-server, object-oriented) that should have left the option open, there appears to have been no point at which the question of the global interconnection of webs was seriously addressed, so Landow’s point about its success in terms of its original objectives remains valid. What it did prove beyond doubt was the risk of hitching a pioneering software development to the whims of a single computer hardware vendor, even one as visionary as Apple.

A lack of serious planning for ‘scalability’ to global coverage and a counterproductive willingness to commit to the Macintosh user interface were two characteristics that Intermedia shared with my attempts to develop a Public Information Communications and Access (PICA) System, although our team had no doubt that we were developing a worldwide service. In the mid to late 1980s, the seductive ‘user friendliness’ of the Mac proved quite a deterrent to keeping on top of what eventually proved to be equally important developments in the then very cryptic world of the Internet.

Immediately after I finished my work on The Australian Beginning at the end of 1981, I started work on ideas for a PICA System, work which continued with some support from a few colleagues until mid-1984, when a sudden insight refocused our energies. Through much of that period I had been a writer for the trade press, especially Australian Micro Computerworld magazine. In that capacity I was given early access to Apple’s Macintosh and gained a rapid appreciation of its revolutionary method of interacting with users. On suddenly seeing the possibility of developing a new class of information service predicated on the Macintosh user interface, our project became ‘The’ PICA System, attracting many hours of (largely unfunded) systems design and documentation work from a core group of supporters and the active involvement of a highly regarded investment broker.

We had identified the potential importance of usability to wider public acceptance of information services more than eight years before a similar consideration inspired Marc Andreessen to develop Mosaic. During those intervening years the scepticism of Macintosh unbelievers was finally watered down by the eventual capitulation of Microsoft to Windows, and of the Unix community to a graphical user interface standard cryptically known as X. So even if we had been able to deliver a working PICA System on target around 1987, we would certainly have had a solution in search of a problem. Such a prospect had been clearly recognised in our business plan, which made considerable allowance for marketing costs. Neither CERN nor NCSA needed to make any provision for initial marketing of the Web or Mosaic. The extensive reach and very nature of the Internet in 1994 provided both the climate of user requirements and the communication channels to reach them, each of which could only have been built at considerable cost in the late 1980s.

The PICA System specification went beyond even Intermedia’s ‘web view’ and use of graphical inclusions to provide for multiple interchangeable views of several types of information, extending the work of a colleague who had included a number of ‘directly manipulable’ views in his project management software for the Macintosh. We incorporated a small group of widely used data formats in a role analogous to that of HTML in the Web, and likewise our supported formats were designed to be open-ended. The PICA System also included a model for controlling access permissions that was based on our practical experience of the publishing industry. This model was not quite as esoteric as Xanadu’s transclusions and tumblers, but was more flexible than the singular approaches to this issue taken by both Intermedia and the Web. The development of the PICA System was eventually put on the back burner by the growth of PICA Pty Ltd’s desktop publishing–related business interests. Sometime after that, we were introduced to and had the chance to become familiar with the Xanadu and Intermedia projects, and it was immediately clear that we all shared a common vision.

With the wisdom of hindsight, it also became clear that the technical support for development projects of this kind was very immature and fragile in the mid-1980s. There is little doubt that it would have taken a lot more than the couple of million dollars of venture capital that we had initially sought to successfully develop and market the PICA System at that time. However, the awareness born of the work that we did get done served to open many doors. PICA Pty Ltd became the Australian distributor for the world’s first widely distributed hypertext software, Office Workstations Limited’s ‘Guide’. In those days, prerelease ‘beta’ testing was carefully controlled, in total contrast to the nowadays commonplace general release of early beta and even alpha versions of Web-related software on the Net. I became a beta tester for Apple’s landmark HyperCard and was able to be at the 1987 Boston Macworld Expo for its public launch. HyperCard became the ‘killer’ application for Macintosh hypertext, but not in the usual sense. HyperCard was unquestionably successful in terms of the market penetration it achieved and the uses to which it was put. However, unlike the ‘killer’ spreadsheets and page layout programs which catalysed new industries, HyperCard simply killed off alternative projects for the development of Macintosh-based hypertext, before finally metamorphosing into a multimedia development platform. One measure of that effect was the emergence of hypertext-based help as the only integral feature of Microsoft Windows that has ever compared favourably to the equivalent facility on the Macintosh.

Just as Xerox lost the plot of the visionary work at PARC and allowed its commercialisation by Apple and others, so those of us who were seduced by the usability of the Macintosh and/or the vision of a global hypertext lost touch with the rapid maturing of the Internet until well into the 1990s. Xanadu and Intermedia (and HyperCard, and even the PICA System in its own small way) have become part of the culture which made the world ready for the rapid uptake of the Web. However, the Web does not fully implement the specified functionality of any of those projects, and none of the actual software defined and developed by those ‘pretenders’ has been taken up as part of the Web itself.19 Meanwhile many of the humans who shared parts of the journey have fallen by the wayside, either burnt out or frozen into some niche they found along the way; but for a few like this writer or Samuel Latt Epstein (whom we will soon meet), the Net, hypertext and usability have reconverged into a long-anticipated platform from which we might push the vision a little further.