The earliest and one of the clearest articulations of the idea that information processing technology could be used to amplify human memory and thinking was the one Doug found that day in 1945, in an article entitled “As We May Think,” published toward the end of the war in The Atlantic Monthly. The author was the highest-ranking scientific administrator in the U.S. war effort, Vannevar Bush.
Bush, the son and grandson of Yankee seafarers, was the same mathematician who had constructed analog computers at MIT in the 1930s. He was also in charge of over 6,000 U.S. scientists during World War II, as director of the Office of Scientific Research and Development. His two most important goals were starting the Manhattan Project and finding a means to stop German bombing, goals that both directly hastened the invention of computing machinery. Ironically, Bush didn’t mention the potential of the early computers as information-handling devices when he wrote his article. But he did present an idea that was to bear fruit many years later – a description of a science-fiction-like general-purpose tool to help us keep track of what we know.
Looking toward the postwar world, Bush foresaw that recent breakthroughs in science and technology were going to create problems of their own. With all these scientists producing all this knowledge at an unprecedented rate, how was anyone to keep track of it all? How would this rapidly expanding body of knowledge benefit anybody if nobody knew how to get the information they needed?
“The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships,” Bush wrote.
He urged men of science to turn their efforts to making the increasingly unwieldy accumulation of human knowledge more accessible to individuals.
But the future technology that Bush foresaw extended beyond the borders of science to the ordinary citizenry. The day was coming when not only scientists but ordinary citizens would be required to navigate through ever-more complicated realms of information. In the pages of the Atlantic, Bush proposed that a certain device should be developed, a device to improve the quality of human thinking. Because one of its functions was to extend human memory, Bush called his hypothetical machine a memex. But Bush was one of the first to see that rapid access to large amounts of information could serve as much more than a simple extension of memory. Although he described it in terms of the primitive information technologies of the 1940s, the memex was functionally similar to what is now known as the personal computer – and more.
Some ideas are like seeds. Or viruses. If they are in the air at the right time, they will infect exactly those people who are most susceptible to putting their lives in the idea’s service. The notion of a knowledge-extending technology was one of those ideas. Fifteen years after Bush published his Atlantic article, J. C. R. Licklider published his article about making computers into a communication medium. But only five years after Bush’s article, Doug Engelbart, infected by the idea of creating a mind-extending tool, incubated his own ideas about how to use machines to augment human intelligence.
After the war, with an electrical engineering degree and his experience with radar, Engelbart found a job at Ames Laboratory in California, working on contracts for one of NASA’s ancestors, the National Advisory Committee on Aeronautics. After a couple of years at Ames, he asked a woman he met there to marry him.
“The Monday after we got engaged,” Engelbart remembers today, “I was driving to work when I was hit with the shocking realization that I no longer had any goals. As a kid who had grown up in the Depression, I was imbued with three goals – get an education, get a steady job, get married. Now I had achieved them. Nothing was left.”
Doug Engelbart tends to think seriously about things when he finds something worth thinking about. And his own life is certainly not exempt from being an object of serious thinking. While he drove along a two-lane paved road that is now a freeway, he reckoned that he had about five and a half million working minutes remaining in his life. What value did he really want from that investment? At the age of twenty-five, in December of 1950, he started to think about what new goals he might set for himself.
“I dismissed money as a goal fairly early in the decision process. The way I grew up, if you had enough money to get by, that was okay; I never knew anybody who was rich. But by 1950, it looked to me like the world was changing so fast, and our problems were getting so much bigger, that I decided to look for a goal in life that would have the most payoff for mankind.”
For several months after he made the decision to commit himself to an appropriately humanitarian enterprise, Doug searched for the right one. He contemplated his situation and skills and thought about the various kinds of crusades he might join. With his radar training, and what he was beginning to learn about computers, Engelbart was also looking for a cause that wouldn’t require him to retread his engineering education, or move too far away from his new home. He had a challenging job and a pleasant drive to work. Santa Clara Valley was still the world’s largest prune orchard, and the electronics industry had only recently moved out of a couple of garages in Palo Alto. The drive gave him time to think.
Ultimately, the kinds of crusades that appealed to him still didn’t satisfy his needs: there weren’t clear-cut ways of organizing one’s thoughts to run a crusade. He was an engineer, not a political organizer, and the world was becoming too complicated for anything but the most well-organized crusades. Suddenly, Doug recognized that he was running into the same fundamental issue over and over again.
Engelbart realized, as had Vannevar Bush, that humankind was moving into an era in which the complexity and urgency of global problems were surpassing time-honored tools for dealing with problems. He also began to understand, as did Licklider a few years later, that handling the informational by-products of problem-solving had itself become the key to all the other problems. The most important task no longer lay in devising new ways to expand our accumulation of knowledge, but in knowing where to look for the answers that were already stored somewhere. “If you can improve our capacity to deal with complicated problems, you’ve made a significant impact on helping humankind. That was the kind of payoff I wanted, so that’s what I set out to do.”
Although many of the details took decades to work out, the main elements of what he wanted to achieve came to him all at once: “When I first heard about computers, I understood, from my radar experience, that if these machines can show you information on punchcards and printouts on paper, they could write or draw that information on a screen. When I saw the connection between a cathode-ray screen, an information processor, and a medium for representing symbols to a person, it all tumbled together in about half an hour.
“I started sketching a system in which computers draw symbols on the screen for you, and you can steer it through different domains with knobs and levers and transducers. I was designing all kinds of things you might want to do if you had a system like the one Vannevar Bush had suggested – how to expand it to a theater-like environment, for example, where you could sit with a colleague and exchange information. God! Think of how that would let you cut loose in solving problems!”
After thirty often-frustrating years of pursuing a dream that the computer industry has long ignored, Doug Engelbart still can’t keep the excitement out of his soft voice and the faraway look out of his eyes when he talks about the prospects he foresaw at twenty-five, and has pursued ever since. But he’s not sure whether today’s generation of computerists, with all their fancy hardware, are getting any closer to the real issues.
Although history has proved him to be an accurate visionary in many ways – if perhaps a less-than-ideal manager of projects and people, and even his friends use the word “stubborn” in describing his attitudes about his theories – Doug Engelbart still wields the power of a quiet person. The magnetism of his long-envisioned goal is still strong for him, so strong that a good deal of it still radiates when he talks about it. In 1971, his friend Nilo Lindgren described him in Innovation magazine:
When he smiles, his face is wistful and boyish, but once the energy of his forward motion is halted and he stops to ponder, his pale blue eyes seem to express sadness or loneliness. Doug Engelbart’s voice, as he greets you, is low and soft, as though muted from having traveled a long distance, as though his words have been attenuated by layers of meditation. There is something diffident yet warm about the man, something gentle yet stubborn in his nature that wins respect.
“He reminds me of Moses parting the Red Sea,” is the way Alan Kay describes Engelbart’s gentle charisma. Of course, the original Moses never set foot in the Promised Land. And he never had the reputation of being an easy man to work with.
In 1951, Engelbart quit his job at Ames and went to graduate school at the University of California at Berkeley, where one of the first von Neumann architecture computers was being built. That was when he began to notice that not only didn’t people know what he was talking about, but some presumably “objective” scientists were openly hostile. He started saying the wrong things to people who could affect his career, things that simply sounded strange to the other electrical engineers.
“When we get the computer built,” this young engineer kept asking, “would it be okay if I use it to teach people? What if I hook it up to a keyboard and get a person to interact with the computer? Maybe teach the person typing?” The psychology people thought it was great, but computers were hardly their department. The engineering people said, “There’s no way that kind of idea is going to fly.”
The interactive stuff was so wild that the people who knew about computers didn’t want to hear about it. Back then, you didn’t interact with a computer, even if you were a programmer. You gave it your question, in the form of a box of punched cards, and if you had worked very hard at stating the question correctly, you got your answer. Computers weren’t meant for direct interaction. And this idea of using them to help people learn was downright blasphemy.
After he got his doctorate, Engelbart came to another one of those internally triggered decision points in his life that his dream continued to bring his way. Nobody in his department wanted to listen to talk about building a better way to solve complex problems, and he felt that he would have to construct a whole new academic discipline before he could begin the research he really wanted to do. The university, Engelbart decided, was a place to get his journeyman’s card, but not a place to follow his vision.
Thus, young Doctor Engelbart went to the commercial world, looking for an opportunity to develop electronic systems that would eventually help him do what he wanted in terms of augmenting human intellect, and would pay his room and board as he contributed to the development of marketable devices as well. Engelbart brought some of his ideas to a progressive young company down the road in Palo Alto. For a change, here were some people looking to the future. Not too much more than a decade out of electrical engineering school themselves, Bill Hewlett, David Packard, and Barney Oliver (their head of research and development) were enthusiastic about Doug’s proposal. A deal was offered. Engelbart drove home, elated. On the way home, in typical Engelbart fashion, Doug started thinking about it.
“I pulled the car over to the first phone booth and called Barney Oliver and said that I just wanted to check my assumption that they saw a future in digital technology and computers – which I thought was a natural path for their electronic instrumentation company to follow. I had assumed that they knew that the ideas I proposed to them that afternoon were only a bridge to digital electronics. And Barney replied that no, they didn’t have any plans for getting into computers. So I said, ‘Well, that’s a shame, because I guess it cools the deal. I have to go the digital route to pursue the rest of what I want to do.’”
“So my deal with Hewlett-Packard was called off,” Doug says, wrapping up the reminiscence with one of his famous wry smiles, adding: “the last time I looked they were number five in the world of computers.”
Doug kept looking for the right institutional base. In October 1957, the very month of Sputnik, he received an offer from an organization in Menlo Park, “across the creek” from Palo Alto, then known as the Stanford Research Institute. They were interested in conducting research into scientific, military, and commercial applications of computers. One of the people who interviewed him for the SRI job had been a year or two ahead of Doug in the Ph.D. program at Berkeley, and Doug told him about his ideas of getting computers to interact with people, in order to augment their intellect.
“How many people have you already told about that?” he asked Doug.
“None, you’re the first one I’ve told,” said Doug.
“Good. Now don’t tell anybody else. It will sound too crazy. It will prejudice people against you.”
So Doug kept quiet about it. For about a year and a half, he earned his living and learned the ropes in the think-tank business and thought about putting his ideas into a written proposal. Then he told his superiors that he was willing to work hard to pay his way at the institute but he really had to have a framework to cultivate his idea – an augmentation laboratory where people and machines could experiment with new ways of creating and sharing knowledge, or at least a project to describe exactly what an augmentation laboratory might be. There was some friction, but eventually he was given the go-ahead.
The U.S. Air Force Office of Scientific Research, ever vigilant for new knowledge about how humans operate machines, provided a small grant. Doug finally got what he wanted – the freedom to explore a field in which he still had no colleagues. “It was lonely work, not having anybody to bounce the ideas off, but I finally got it written down in a paper I finished in 1962 and published in 1963.”
Total silence from the community greeted the announcement of the conceptual framework Engelbart had thought about and worked to articulate for over a decade. But the few people who happened to be listening happened to be the right people. Bob Taylor, a young fellow at NASA who was one of the bright technological vanguard of the post-Sputnik era, one of the new breed of research funders who didn’t fear innovation as a matter of reflex, pushed some of the earliest funding of Doug’s project.
Fortunately, by that time another one of the few people who were able to understand Engelbart’s vision, J. C. R. Licklider, was moving ahead with his ARPA funding blitz. As a result of Licklider’s support, time-sharing was coming along rapidly. By the early sixties, some of the low-level hardware and software tools to build Doug’s dreamed-of high-level methodological and conceptual structures were being tested. Licklider and Taylor thought Engelbart was just the kind of forward-thinking researcher they wanted to recruit for the task of finding new and powerful uses for the computational tools their research teams were creating. They were particularly interested in the same paper of Doug’s that the mainstream of computer science had chosen to ignore.
The paper that attracted the attention of ARPA and met such a thundering silence from the wider community of computer theorists in 1963 was entitled “A Conceptual Framework for the Augmentation of Man’s Intellect.” In its introduction, Engelbart presented the manifesto by which he meant to launch an entire new field of human knowledge:
By “augmenting man’s intellect” we mean increasing the capability of a man to approach a complex problem situation, gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: that comprehension can be gained more quickly; that better comprehension can be gained; that a useful degree of comprehension can be gained where previously the situation was too complex; that solutions can be produced more quickly; that better solutions can be produced; that solutions can be found where previously the human could find none. And by “complex situations” we include the professional problems of diplomats, executives, social scientists, life scientists, attorneys, designers – whether the problem situation exists for twenty minutes or twenty years. We do not speak of isolated clever tricks that help in particular situations. We refer to a way of life in an integrated domain where hunches, cut-and-try, intangibles, and the human “feel for a situation” usefully coexist with powerful concepts, streamlined technology and notation, sophisticated methods, and high-powered electronic aids.
It was no accident that “hunches, cut-and-try, intangibles,” were listed early and “high-powered electronic aids” was listed last. Although he knew that widespread access to digital computers was the only means by which our society could make use of an augmented knowledge system, Engelbart also understood that the hardware was a low-level component of the total system he meant to augment. Human intellect uses tools, but the power of the human mind is not itself limited to the tools the human brain automatically provides.
Our culture has given us sophisticated procedures for dealing with problems, procedures that augment our innate capacity for learning new things by giving us the benefit of what others before us have learned. These ways of doing things are the software that creates civilization. A member of a preliterate culture of the remote New Guinea highlands, for example, possesses the same innate mental capabilities as a Western city-dweller, but something else must be added to the repertoire of what that New Guinea highlander knows how to do before he can drive a car, check out a book from a library, or write a letter.
The “something extra,” Engelbart emphasized, is not a property of the tool. It isn’t the nervous system of the individual that separates the “civilized” person from the “primitive.” To certain cultures that we deem primitive, the most sophisticated urbanite is decidedly lacking in the necessary survival skills. If the cultural situation of the previous paragraph were reversed, the same ignorance on the part of the displaced person would be evident: If you drop a lifelong New Yorker into the New Guinea highlands, don’t expect him or her to know how to build a grass shelter or what to do in a tropical storm. Somebody who knows what to do in those situations has to teach survival skills to the newcomer, thus augmenting his or her innate capacities. It is here that the original augmentation of human intellect comes in – the tools and procedures that cultures make available to individuals:
Our culture has evolved means for us to organize and utilize our basic capabilities so that we can comprehend truly complex situations and accomplish the processes of devising and implementing problem solutions. The ways in which human capabilities are thus extended are here called augmentation means, and we define the four basic classes of them:
1. Artifacts – physical objects designed to provide for human comfort, the manipulation of things or materials, and the manipulation of symbols.
2. Language – the way in which the individual classifies the picture of his world into the concepts that his mind uses to model that world, and the symbols that he attaches to those concepts and uses in consciously manipulating the concepts (“thinking”).
3. Methodology – the methods, procedures, and strategies with which an individual organizes his goal-centered (problem-solving) activity.
4. Training – the conditioning needed by the individual to bring his skills in using augmentation means 1, 2, and 3 to the point where they are operationally effective.
The system we wish to improve can thus be visualized as comprising a trained human being together with his artifacts, language, and methodology. The explicit new system we contemplate will involve as artifacts computers and computer-controlled information-storage, information-handling, and information-display devices. The aspects of the conceptual framework that are discussed here are primarily those relating to the individual’s ability to make significant use of such equipment in an integrated system.
The biggest difference between the citizen of preliterate culture and the industrial-world dweller who can perform long division or dial a telephone is not in the brain’s “hardware” – the nervous system of the highlander or the urbanite – but in the thinking tools given by the culture. Reading, writing, surviving in a jungle or a city, are examples of culturally transmitted human software. The hypothetical transplanted native, Engelbart points out, can move step by step through an organized program by which he or she may learn to drive a car or check out a book from a library.
How do we adapt to new ways of thinking? Engelbart used the metaphor of a toolkit, and proposed that we organize our intellectual problem-solving tools in a hierarchy:
It is likely that each individual develops a certain repertory of process capabilities from which he selects and adapts those that will compose the processes that he executes. This repertory is like a toolkit. Just as the mechanic must know what his tools can do and how to use them, so the intellectual worker must know the capabilities of his tools and have suitable methods, strategies, and rules of thumb for making use of them. All of the process capabilities in the individual’s repertory rest ultimately on basic capabilities within him or his artifacts, and the entire repertory represents an integrated, hierarchical structure (which we often call the repertory hierarchy).
As an example, Engelbart offered the process of issuing a memorandum – a task that involves putting specific information in a formal package and distributing it to other people. The reason for writing the memo, the memo writer’s role in the organization, the intended audience, the importance of the subject matter of the memo to the organization’s goals – these are the higher-level components of the hierarchy.
At an intermediate level are the skills of marshaling facts, soliciting opinions, thinking, formulating ideas, weighing alternatives, forecasting, making judgments, that go into framing the memo, and all the communication skills that go into putting the memo into form. Toward the bottom of the hierarchy are the artifacts used to prepare the memo and the medium by which it is communicated – typewriter, pencil, paper, interoffice mail.
Engelbart proposed a hypothetical method for boosting the effectiveness of the whole system by introducing an innovative technology into a relatively low level of the hierarchy. “Suppose you had a new writing machine,” he wrote, “a high-speed electric typewriter with some very special features.” In a few words, he proceeded to describe what is known today as a “word processor.”
What might be the effect of such a machine on the memo-writing process? Engelbart’s 1963 speculations sound like advertising copy for word processing systems of the 1980s – and more:
This hypothetical writing machine permits you to use a new process for composing text. For instance, trial drafts can rapidly be composed from rearranged excerpts of old drafts, together with new words or passages which you insert by hand typing. Your first draft may represent a free outpouring of thoughts in any order, with the inspection of foregoing thoughts continuously stimulating new considerations and ideas to be entered. If the tangle of thoughts represented by the draft becomes too complex, you can compile a reordered draft quickly. It would be practical for you to accommodate more complexity in the trails of thought you might build in search of the path that suits your needs.
You can integrate new ideas more easily, and thus harness your creativity more continuously, if you can quickly and flexibly change your working record. If it is easier to update any part of your working record to accommodate new developments in thought or circumstance, you will find it easier to incorporate more complex procedures in your way of doing things. . . .
The important thing to appreciate here is that a direct new innovation in one particular capability can have far-reaching effects throughout the rest of your capability hierarchy. A change can propagate up through the capability hierarchy; higher-order capabilities can now reorganize to take special advantage of this change and of the intermediate higher-capability changes. A change can propagate down through the hierarchy as a result of new capabilities at the high level and modification possibilities latent in lower levels. These latent capabilities may have been previously unusable in the hierarchy and become usable because of the new capability at the higher level.
While Engelbart was, in fact, suggesting that computers could be used to automate a low-level task like typewriting, the point he wanted to make had to do with changes in the overall system – the capabilities such an artifact would open up for thinking in a more effective, wider-ranging, more articulate, quicker, better-formatted manner. That is why he distinguished his proposed new category of computer applications by using the term augmentation rather than the more widespread word automation.
From Engelbart’s point of view, the fact that it took over fifteen more years for word processing to catch on was not as important as the fact that people continue to myopically concentrate on the low-level automation and ignore the more important leverage it makes possible at higher levels. The hypothesis he presented in the 1963 framework was that computers represent a new stage in the evolution of human intellectual capabilities. The concept manipulation stage was the earliest, based in biological capabilities of the brain, followed by the stage of symbol manipulation, based on speech and writing, and the stage of manual external symbol manipulation, based on printing.
The computer-based typewriter was an example of the coming fourth stage of automated external symbol manipulation, to be brought about by, but not limited to, the application of computers to the process of thinking and communicating:
In this stage, the symbols with which the human represents the concepts he is manipulating can be arranged before his eyes, moved, stored, recalled, operated upon according to extremely complex rules – all in very rapid response to a minimum amount of information supplied by the human, by means of cooperative technological devices. In the limit of what we might now imagine, this could be a computer, with which individuals could communicate rapidly and easily, coupled to a three-dimensional color display within which extremely sophisticated images could be constructed, the computer being able to execute a wide variety of processes on parts or all of these images in response to human direction. The displays and processes could provide helpful services and could involve concepts not hitherto imagined (e.g., the pregraphic thinker would have been unable to predict the bar graph, the process of long division, or card file systems).
. . . we might imagine some relatively straightforward means of increasing our external symbol-manipulation capability and try to picture the consequent changes that could evolve in our language and methods of thinking. For instance, imagine that our budding technology of a few generations ago had developed an artifact that was essentially a high-speed, semiautomatic table-lookup device, cheap enough for almost everyone to afford and small enough to be carried on the person. Assume that the individual cartridges sold by manufacturers (publishers) contained the lookup information, that one cartridge could hold the equivalent of an unabridged dictionary, and that a one-paragraph definition could always be located by the average practiced individual in less than three seconds. What changes in language and methodology might not result? If it were so easy to look things up, how would our vocabulary develop, how would our habits of exploring the intellectual domains of others shift, how might the sophistication of practical organization mature (if each person could so quickly and easily look up applicable rules), how would our education system take advantage of this new external symbol-manipulation capability of students and teachers and administrators?
At the end of the 1963 paper, Engelbart proposed that the hypothesis should be tested by constructing an augmentation laboratory in which humans could use new information processing artifacts to explore the new languages, methods, and training made possible by the computer systems then coming into existence in Cambridge, Lexington, Berkeley, and Santa Monica. Since the ultimate product was to be for everyone, not just computer experts, people who were involved in editing, designing, and other knowledge-related fields would have to be recruited to join the electrical engineers and programmers. Because the goal was to enhance the power of the human mind, and to learn how to introduce such enhancements to human organizations, a psychologist would also be needed.
The laboratory itself would have to be a consciously designed bootstrapping tool, because the very tools this team would be constructing first were the tools needed to do their own jobs better. Before they could hope to augment other people’s tasks, they had to augment their own jobs. Bootstrapping – building the tools to build better tools, and testing them on yourself as you go along – was a central component of Engelbart’s strategy, intended to match the pace of anticipated developments in computer technology. SRI management had few illusions about obtaining the funding necessary to implement such a scheme.
In 1964, Bob Taylor, who by that time had moved from NASA to ARPA, told Engelbart and SRI that the Information Processing Techniques Office was prepared to contribute a million dollars initially to provide one of the new time-sharing computer systems, and about a half a million dollars a year to support the augmentation research. It came as a surprise to Engelbart’s superiors, who were eager to procure government contracts for developing new computer technologies, but who didn’t exactly regard his grandiose plans for a mind-extending laboratory as their most promising candidate for large-scale funding. One can imagine the SRI brass pulling out the organization chart after the ARPA funders left, to find out who and where Doug Engelbart happened to be.
Here was the support Engelbart had been seeking for years, coming right at the point where the conceptual framework for the system had already been worked out and the technology he needed was becoming available. The next step was to assemble the team who would build the first prototype.
Perhaps the Augmentation Research Center’s greatest effect on computer culture for generations to come was in the succession of remarkable people who passed through that laboratory and on to other notable research projects. Dozens of gifted individuals over the span of a decade dedicated themselves to putting into action the system Engelbart and Licklider had dreamed about in previous years. Many of those former Engelbart protégés are now leaders of their own research teams at universities or the R & D divisions of commercial computer manufacturers.
The Augmentation Research Center (ARC) consisted of the “engine room,” where the new time-sharing computers were located, a hardware shop where the constantly upgraded computer systems and experimental input-output devices were built and maintained, and a model “intellectual workshop” that consisted of an amphitheater-like space in which a dozen people sat in front of large display terminals, creating the system’s software, communicating with each other, and navigating through dimensions of information by means of what was known as NLS (for oNLine System).
NLS was an exotic and intoxicating new brew of ARPA-provided gadgetry, homebrewed software wizardry, and altogether new intellectual skills that were partially designed in advance and partially thrown together as the designer-subjects of the experiment went along. After four years of stumbling, backtracking, leaping forward, then more confidently exploring this new territory, after hardware crises and software crises and endless argumentation about how to go about doing what they all agreed ought to be done, NLS was beginning to fulfill the hopes its builders had for it. It was time to gamble.
Whenever he consulted the feeling in his stomach, Doug Engelbart had no doubt that it was a gamble. Sitting all alone on that stage in San Francisco, watching his support team scramble around the hastily woven nest of cables and cameras surrounding the base of the platform, facing an audience of several thousand computer experts, it was all too evident to Doug that any number of possible accidents – a thunderstorm, a faulty cable, a concatenation of software glitches – could effectively kill their future chances of obtaining research funds.
But he had begun to lose his patience, waiting for decades for the rest of the world to catch on to something as important as augmentation. And his colleagues shared Engelbart’s confidence in the delicate coalition of people, electronic devices, software, and ideas they called the NLS system.
Doug’s painstakingly thought-out conceptual framework, the prototype hardware systems he and Bill English developed, and his bootstrapping laboratory of systems programmers, computer engineers, psychologists, and media specialists were only corroborating what Doug had known for years – computers can help intellectual workers think better. By the late 1960s, the problem lay in getting his ideas and the meaning of his team’s accomplishments across to people in the wider computer world.
The augmentation center, as planned, had grown to seventeen people by 1968. They were on their third upgraded computer system, and the software was evolving from the first crude experimental versions to a real working toolkit for information. In a matter of months, the SRI Augmentation Research Center was due to become the Network Information Center for ARPA’s experiment in long-distance linking of computers – the fabled ARPAnet.
In the fall of 1968, when a major gathering of the computer clans known as the Fall Joint Computer Conference was scheduled in nearby San Francisco, Doug decided to stake the reputation of his long-sought augmentation laboratory in Menlo Park – literally his life’s work by that time – on a demonstration so daring and direct that finally, after all these years, computer scientists would understand and embrace that vital clue that had eluded them for so long.
Those who were in the audience at Civic Auditorium that afternoon remember how Doug’s quiet voice managed to gently but irresistibly seize the attention of several thousand high-level hackers for nearly two hours, after which the audience did something rare in that particularly competitive and critical subculture – they gave Doug and his colleagues a standing ovation.
The audience, in the same room where the first “computer faire” for microcomputer homebrew hobbyists was held some years later, witnessed a kind of media presentation that nobody in the computer milieu had ever experienced before. State-of-the-art audiovisual equipment was gathered from around the world at the behest of a presentation team that included Stewart Brand, whose experience in mind-altering multimedia shows derived from his production, a few years earlier and not far from this same auditorium, of the get-togethers known as “Acid Tests.”
Doug’s control panel and screen were linked to the host computer and the rest of the team back at SRI via a temporary microwave antenna they had set up in the hills above Menlo Park. While Doug was up there alone in the cockpit, a dozen people under the direction of Bill English worked frantically behind the scenes to keep their delicately transplanted system together just long enough for this crucial test flight. For once, fate was on their side. Like a perfect space launch, all the minor random accidents canceled each other. For two hours, seventeen years ago, Doug Engelbart finally got his chance to take his peers – augmentation pioneers and number crunchers as well – on a flight through information space.
Fortunately for the historical record, a film of the event was made. Those who were at the original event say that the sixteen-millimeter film is a poor shadow of the original show. During the original presentation, an advanced electronic projection system provided a sharply focused image, twenty times life sized, on a large screen. Doug was alone on the stage, the screen looming above and behind him as he sat in front of his CRT display, wearing the kind of earphone-microphone headsets that radar operators and jet pilots use, his hands resting on an unusual-looking control console connected to his chair.
The specially designed input console swiveled so he could pull it onto his lap. A standard typewriter keyboard was in the center, and two small platforms projected about six inches on either side. On the platform to his left was a five-key device he used for entering commands, and on the platform to the right was the famous “mouse” that is only now beginning to penetrate the personal computing market – a device the size of a pack of cigarettes, with buttons on the top, attached to the console with a wire. Doug moved it around with his right hand.
In front of him was the display screen. The large screen behind him could alternate, or share, multiple views of Doug’s hands, his face, the information on the display screen, and images of his colleagues and their display screens at Menlo Park. The screen could be divided into a number of “windows,” each of which could display either text or image. The changing information displayed on the large screen, activated by his fingertip commands on the five-key device and his motions of the mouse, began to animate under Doug’s control. Everyone in the room had attended hundreds of slide presentations before this, but from the moment Doug first imparted movement to the views on the screen, it became evident that this was like no audiovisual presentation anyone had attempted before.
Engelbart was the very image of a test pilot for a new kind of vehicle that doesn’t fly over geographical territory but through what was heretofore an abstraction that computer scientists call “information space.” He not only looked the part, but acted it: the Chuck Yeager of the computer cosmos, putting the new system through its paces and reporting back to his astonished earthbound audience in a calm, quiet voice.
Imagine that you are in a new kind of vehicle with virtually unlimited range in both space and time. In this vehicle is a magic window that enables you to choose from a very large range of possible views and to rapidly filter a vast field of possibilities – from the microscopic to the galactic, from a certain word in a certain book in a certain library, to a summary of the entire field of knowledge.
The territory you see through the augmented window in your new vehicle is not the normal landscape of plains and trees and oceans, but an informationscape in which the features are words, numbers, graphs, images, concepts, paragraphs, arguments, relationships, formulas, diagrams, proofs, bodies of literature and schools of criticism. The effect is dizzying at first. In Doug’s words, all of our old habits of organizing information are “blasted open” by exposure to a system modeled, not on pencils and printing presses, but on the way the human mind processes information.
When the new vehicle for thought known as Arabic numbers was introduced to the West, and mathematicians found that they didn’t have to fumble with Roman numerals in their calculations anymore, the mental freedom must have been dizzying at first. But not nearly as dizzying as this. There is a dynamism of the informationscape that needs no explanation, that needs only to be experienced to be understood. In that sense, Doug knew he had no choice but to take the risk of putting it up on the big screen and letting his audience judge for themselves.
Even the chewing-gum-and-baling-wire version Doug was attempting to get off the ground in 1968 had the ability to impose new structures on what you could see through its windows. The symbolic domain, from minutiae to the grandest features, could be arranged at will by the informationaut, who watched through his window while he navigated his vehicle and the audience witnessed it all on the big screen. Informational features were reordered, juxtaposed, deleted, nested, linked, chained, subdivided, inserted, revised, referenced, expanded, summarized – all with fingertip commands. A document could be called up in its entirety, or the view could be restricted to only the first line or first word of each paragraph, or the first paragraph of each page.
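The kind of view control Doug demonstrated – collapsing a document so that only the first line of each paragraph shows – is easy to sketch in modern terms. The fragment below is purely illustrative: NLS was not written in Python, and its actual “viewspec” machinery was far richer than this.

```python
def truncated_view(document, lines_per_paragraph=1):
    """Return a collapsed 'view' of a document: only the first few
    lines of each paragraph, in the spirit of an NLS viewspec."""
    view = []
    for paragraph in document:
        lines = paragraph.splitlines()
        view.append("\n".join(lines[:lines_per_paragraph]))
    return view

doc = [
    "Bootstrapping means building tools\nto build better tools.",
    "The laboratory tested every tool\non its own builders first.",
]
print(truncated_view(doc))
# ['Bootstrapping means building tools', 'The laboratory tested every tool']
```

The document itself is never altered; only the view of it changes, which is what made the screen seem to animate under Doug’s control.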
One of the example tasks he demonstrated involved the creation of the presentation he was giving at the moment, from the outline of the talk to the logistics of moving their setup to the Civic Auditorium. The content of the information displayed on the screen referred to the lecture he was giving at the moment, and the lecture referred to the information on the screen – an example of the kind of self-referential procedure that programmers call “recursion.”
Doug moved his audience’s attention through the outline by the way he manipulated their “views” of the information. His manipulations maneuvered the screen display and the audience’s consciousness through categories of information, zoomed down to subcategories, broke them into their atomic components, rearranged them, then zoomed back up the hierarchy to meet the vocal narration at a key point in the story, when the words on the screen and the words coming from the narrator merged before branching off again. It was an appropriately dramatic presentation of a then-novel use of computers. While it appeared to be a radically sudden innovation to many of those in the audience, it was the culmination of careful experimentation at ARC that had already spanned most of a decade.
It is almost shocking to realize that in 1968 it was a novel experience to see someone use a computer to put words on a screen; in this era of widespread word processing, it is hard to imagine that so few people were able to see in Doug’s demonstration the vanguard of an industry. When time-sharing systems first allowed programmers to interact directly with computers, in the early 1960s, the programmers developed tools known as “text editors” to help them write programming code. (The first one at MIT had a hand-lettered sign that dubbed it “expensive typewriter.”) But “word processing” for non-programmers was still far in the future, despite Engelbart’s demonstration of its potential.
The quality of video display technology in 1968 was also amazingly primitive by today’s standards. The letters and numbers on Doug’s screen looked as if they were handwritten – closer to crude swaths “painted” onto a radar screen than the crisp pixels we are accustomed to seeing today on video display terminals.
In seeking a domain where a small success would mean a large boost in effectiveness, and where success would attract a large-scale research and development effort, Doug chose to augment the “humdrum but practical and important sorts of tasks” that occupy an increasing proportion of the people in our society: preparing, editing, and publishing documents. This area of document preparation and communication was but a small slice from the grand range of applications he envisioned, but it was one tool that the augmentation team itself needed immediately, and one that every laboratory and office in the world would want – as soon as people understood that computers weren’t just calculators.
The seventeen members of the Augmentation Research Center, Engelbart explained during their 1968 show, were attempting to create a medium that would be useful to the other ARPA computer researchers and eventually to anyone who works with information. At the same time, this was a behavioral science experiment as well as a computer systems experiment, because the project team would be the subjects as well as the architects of the research. Making computers do what they wanted was only the beginning. The really difficult work was adjusting themselves to new ways of working and thinking.
Consequently, one of the first projects was to create a system to make it easy for the members of the research team – and eventually for other intellectual workers – to compose, store and retrieve, edit, and communicate words, numbers, and graphics. “Text editing” had to become more amenable to non-programmers and more suited for the expression of thoughts and composition of prose.
They needed to invent display devices and adapt the computer and write the programs; then they had to use what they had invented to compose a description of the system. The hardware and software specialists worked on representing symbols on screens and storing them in the computer’s memory. Then other specialists used the text editors to write the manuals to instruct future members of the growing project in the use of the new tools.
The text-editing system was the first stage of Doug’s long-term plan. The actual use of the system to design and describe the next generation was the second stage. Both stages were accomplished by 1968. Even as early as 1968, NLS was not limited to what we now call a word processing system. The third-stage goal was to build an entire toolkit for intellectual tasks, and develop the procedures and methods by which those tools could be used, individually and collectively, to boost the performance of people who did information-related work. The toolkit would then be used to develop new modes of computer-aided human collaboration.
Software was created to connect the text-editing system with a special kind of electronic filing arrangement that would serve as a unifying memory, record, and medium for their individual efforts. The software journal, through which individuals and groups could have access to a shared thinking and communicating space, had been in development since 1965–1966; it enabled individuals to insert comments into the group record of the augmentation experiments (or browse through them), and enabled programmers to trace the way system features had evolved. The journal, along with shared-screen telephoning to enhance real-time, one-to-one communications, was part of the overall dialogue support system designed to help increase the effectiveness of group communication and decision making.
The idea of the journal predated the development of computer networks and teleconferencing, originating as it did with a dozen terminals connected to a single multiaccess computer. It was an important first try at “reaching through” the toolkit to engage in communication with another human user of the system. It was a theoretical precursor to the “electronic mail” medium that was to evolve when the ARPA network became operational in the early 1970s. When ARPAnet came along, connecting many computers in different locations into a shared computational “space,” it wasn’t such a shocking new medium to those few ARC pioneers who had been working on a smaller, localized version for years.
The journal was designed to bring order to a stream of dialogues, notes, and publications generated in the process of building the system and finding out how to work it. Besides serving as an electronic logbook that would be useful to human factors specialists and systems programmers, the journal was meant to be a medium for a formal dialogue among users that would serve the same purpose as today’s traditional libraries and professional journals – but would do so in such an amplified manner that it would become a uniquely powerful method of transmitting knowledge.
For example, scientific journals in every field follow a form in which a paper describing research results is refereed, then published, after which subsequent papers can cite the previous paper. The record in any field of scientific knowledge – and the forum in which the significance of findings is debated – consists of a growing list of journal citations and accompanying text. It takes time for new innovation and comments to circulate, and it takes a relatively long time for individuals to thread their way through a branching history of citations. In the NLS version, it is very easy to jump directly and quickly from any article to the text of cited articles and back – reducing to seconds or minutes procedures that would take hours or months in even the most efficient library / journal system.
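The jump from citation to cited text that NLS made nearly instantaneous is, underneath, a traversal of a graph of documents. A minimal sketch, with invented article names and nothing like the real NLS data structures:

```python
# A toy citation graph: each article maps to the articles it cites.
citations = {
    "memex-1945": [],
    "framework-1963": ["memex-1945"],
    "workshop-1973": ["framework-1963", "memex-1945"],
}

def citation_closure(article, graph):
    """Collect every article reachable by following citations:
    the 'branching history' a reader would otherwise thread by hand."""
    seen, stack = set(), [article]
    while stack:
        for cited in graph.get(stack.pop(), []):
            if cited not in seen:
                seen.add(cited)
                stack.append(cited)
    return seen

print(sorted(citation_closure("workshop-1973", citations)))
# ['framework-1963', 'memex-1945']
```

What took hours or months in a paper library becomes one dictionary lookup per hop; following a link back is just another lookup.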
Publication and distribution are radically changed by a computerized system, since it is so easy to automatically notify everybody on a certain kind of reading list that material matching their interest profile is now available. Distribution lists can be members of distribution lists – you can designate a list to be the recipient of an announcement, and every member of the designated list will receive your message. Messages and articles can contain lists of citations, and catalogs and indices can be message forms of their own. Ideas and hypotheses could be conveyed by telling interested members of the community to read a certain list of cited articles in a particular order.
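Because a distribution list may itself contain other lists, delivering a message means recursively expanding lists into individuals, guarding against the possibility that two lists include each other. A hedged sketch, with invented names rather than anything from the NLS journal itself:

```python
def expand_recipients(name, directory, seen=None):
    """Expand a distribution list into individual recipients.
    Lists may contain other lists; `seen` guards against cycles."""
    if seen is None:
        seen = set()
    if name in seen:
        return set()
    seen.add(name)
    members = directory.get(name)
    if members is None:          # not a list: an individual recipient
        return {name}
    recipients = set()
    for member in members:
        recipients |= expand_recipients(member, directory, seen)
    return recipients

directory = {
    "arc-staff": ["engelbart", "english", "nic-team"],
    "nic-team": ["watson", "norton"],
}
print(sorted(expand_recipients("arc-staff", directory)))
# ['engelbart', 'english', 'norton', 'watson']
```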
This more formal and highly structured kind of intellectual discourse is essential to science, but is not the usual mode of communication used in the day-to-day affairs of ordinary citizens. As Licklider and Taylor, Doug’s long-time colleagues and principal funders, pointed out in 1968, the new interactive computers and new intercomputer networks would make it possible to use tools like NLS to construct a computer-aided community in which not only intellect but communication could be augmented.
At the most fundamental level, communication begins when two or more people need to share information, transact business, make decisions, resolve differences, reach agreements, solve problems, communicate plans. One of the early creations in the NLS collection of software levers and pulleys and skyhooks brought the other capabilities of the system to bear on communications. ARC developed a “mode of teleconferencing” whereby:
. . . two or more people, positioned at separated display consoles, can link their displays so that all see the same image, and at option any can exercise control. When simultaneously talking on the telephone the resulting dialogue can be uniquely effective – corresponding to an in-person conference around a collective assemblage of their scratch pads, working records, and individual support facilities. . . .
But consider the great potential already existing when some of the participants – or even a single participant – can effectively use computer tools to work with the relevant materials and processes. There is a great value in merely conducting themselves as though they were congregated at a magic blackboard – each easily able to pull forth materials from his notes or familiar reference sources, copy across into his private workplace any material offered from what the other . . .
In 1969, ARC became one of the original nodes of the ARPAnet system that connected defense-related research computers around the country into a network. The network, Bob Taylor’s brainchild, used common-carrier communication lines to interconnect computers in different parts of the country. While the separate time-sharing communities were busy exchanging data, programs, and messages, the ARC people saw their participation in the network as an opportunity to put their knowledge to good use, and to extend their experiment beyond their SRI laboratory to include everyone around the country who was connected to the network.
As the network grew, ARC branched out from its primary activity of continually redesigning itself. It began serving as the Network Information Center, offering referencing and organizing services for the distributed community of ARPAnet users. No longer languishing in a half-forgotten Quonset somewhere on the huge SRI grounds, the augmentation laboratory, equipped with the latest time-sharing hardware, was by 1970 the proud subject of VIP tours.
After so many years of solitary envisioning, Engelbart had become even more optimistic about the ultimate significance of their enterprise than he had been when he started. In the spring of 1970 he told his colleagues at the Interdisciplinary Conference on Multi-Access Computer Networks:
. . . It has been my business to struggle with these concepts for two decades now, and the signs that I read at least tell me that the changes in our ways of thinking and working will be more pervasive and extreme than any of us appreciates – a revolution like the development of writing and the printing press lumped together. . . .
It will take explorers of this domain decades even to map its currently visible dimensions. The real rush hasn’t begun: this Conference is a meeting of suppliers looking for the prospector trade; we haven’t really been giving attention to the developments that will follow the prospecting.
My research group is now moving into a next stage of work that we call “team augmentation.” Here, instead of just the individual facilitating his private domain searching, studying, thinking and formulating, as his office place provides for him, we are exploring what can be done for a team of “augmented individuals” who have in common a number of terminals, a set of computer tools, working files, etc. (as we do), to facilitate their team collaborations.
The problem-solving assistance Engelbart had dreamed about alone in the 1950s became the “integrated working environment” he proposed in 1963, which in turn grew into the toolbuilders’ toolkit that he and his small group of colleagues used to build an “intellectual workshop” throughout the remaining seven years of the decade. By the early 1970s, the wider community of ARPA-funded computer researchers and representatives of the business world were joining the bootstrapping process. Paradoxically, just when their leader decided that “team augmentation” would be their goal, his own team began to react negatively to growing pressures – technological, psychological, and social.
Doug had always warned that “the larger augmentation system is much more complex than the technological ‘subsystem’ upon which it depends,” and the 1970s were the era when ARC began to practice what Engelbart had preached. During the first decade of the laboratory’s existence, computer technology had progressed at an astonishing pace, and the SRI crew were doing their utmost to use the innovations as quickly as they came along.
The “rule of two” (that computer power would double every two years) and the Engelbart-induced zeal of the augmentation team kept them fueled for an effort to bootstrap and continually adjust themselves to the capabilities of their upgraded tools – an effort that required extraordinary intensity. The bootstrapping and readjusting continued with unabated enthusiasm, at least until the early 1970s, when the idea of building a system that was meant to “transcend itself every six to eight months” to keep pace with hardware and software advances turned out to be more pleasant to contemplate than to carry out. It had been challenging and exhilarating to build this new system for augmenting thought – but it wasn’t as much fun having one’s work habits augmented at a forced-march pace.
When both the old-timers and newcomers to the growing project faced the task of learning new roles, changing old attitudes, and adopting different methods on a regular basis, just because the system enabled them to do so, the great adventure became more arduous than any of the ARC pioneers/experimental subjects had anticipated. So a psychologist was brought in to consult about those parts of the system that weren’t found in the circuitry or software, but in the thoughts and relationships of the people who were building and using the system.
Dr. James Fadiman joined ARC as an observer-catalyst-therapist. Fadiman was particularly interested in the ways human consciousness and behavior change in new situations, and it didn’t take him long to realize that the process of “being augmented” was in fact a new, nonchemical form of altered consciousness.
Several of the things Fadiman learned about the “augmentation experience” have taken more than a decade to filter out to people who design computers for nonexperts. One thing he learned almost immediately was that most people resist change, especially in the workplace, and resistance works both ways – people who are resistant to learning an augmentation system are equally resistant to giving it up once they have adopted it. The initial resistance is partially grounded in a general fear of the unknown.
Doug Engelbart, of course, saw these things on his own scale, and through the eyes of an engineer. There would be rough spots, software and interpersonal bugs, arguments and conflicts, to be sure – but the master plan was progressing nicely, considering all those years he had worked alone. The toolkit had become a workshop, and they knew the workshop indeed worked because they had been their own guinea pigs for a decade.
In the same 1970 address in which he referred to the multiaccess computing community as a “meeting of suppliers looking for the prospector trade,” Engelbart also predicted that the future would see “a steadily increasing number of people who spend a significant amount of their professional time at terminals,” and speculated that the future of dispersed personal augmentation systems linked together into network communities would create new kinds of societal institutions: “In particular, there will emerge a new ‘marketplace,’ representing fantastic wealth in commodities of knowledge, service, information, processing, storage, etc.”
In his usual forge-ahead manner, Engelbart was already bringing members of the business community into the ARC experiment. Business managers and management scientists had been working at ARC, experimenting with using NLS tools to manage the steadily growing ARC project. In proper bootstrapping style, they looked at their attempts to apply the system to their own research management as yet another experiment. Richard Watson and James C. Norton worked closely with ARC to develop their experimental discoveries into a system that would be usable by people who were not computer experts but whose occupations involved the manipulation of information.
Sometime in the early 1970s, Engelbart was inspired by a book, just as he had been enthused by magazine articles by Bush and Licklider in years past. This time, it was the theory proposed by business management expert Peter Drucker in the late 1960s. Knowledge, by Drucker’s definition, is the systematic organization of information; a knowledge worker is a person who creates and applies knowledge to productive ends. The rapid emergence of an economy based primarily on knowledge, Drucker predicted, would be the most significant social transformation of the last quarter of the twentieth century.
Drucker noted something about the future of knowledge in the American economy that seemed to converge, from an unexpected but not unpredictable direction, with the course Engelbart had plotted for the augmentation project at the beginning of its second decade. Drucker was one of the first of a growing number of social scientists who have claimed that an examination of labor statistics reveals a great deal about the role of knowledge work in everybody’s future.
In 1973, ten years after his solo “Framework,” Engelbart, Watson, and Norton presented a paper on “The Augmented Knowledge Workshop” to the National Computer Conference. Acknowledging their debt to Drucker’s ideas, the authors pointed out that the special computer systems that had been evolving at ARC were designed to alleviate the problems associated with “the accelerating rate at which knowledge and knowledge work are coming to dominate the working activity of our society”:
In 1900 the majority and the largest single group of Americans obtained their livelihood from the farm. By 1940 the largest single group was industrial workers, especially semiskilled machine operators. By 1960, the largest single group was professional, managerial, and technical – that is, knowledge workers. By 1975–80 this group will embrace the majority of Americans. The productivity of knowledge has already become the key to national productivity, competitive strength, and economic achievement, according to Drucker. It is knowledge, not land, raw materials, or capital, that has become the central factor in production.
Noting Drucker’s use of terms such as “knowledge organizations” and “knowledge technologies,” Engelbart, Watson, and Norton specified an augmented knowledge workshop that was nothing less than a totally redesigned working environment for everybody in the “knowledge sector.” The authors acknowledged that ordinary knowledge workshops – offices, boardrooms, libraries, universities, studios – have existed for centuries. Augmented knowledge workshops, however, existed only as prototypes, and would not come into widespread usage until the technologies pioneered at ARC (and by then, at a new place across the creek, called PARC) grew economical enough to sell as office equipment. This was the origin of an idea that was later adapted by others in a truncated version known as “The Office of the Future.”
The authors described the technology they had built and used for augmenting their own knowledge as individuals and in groups, but emphasized that the tools were only the first part of a total transformation of the system – including changes in methods, attitudes, roles, lifestyles, and working habits. They knew from their own experience that the psychological and social adjustments would be the most intense and volatile changes set off by the introduction of these systems into existing organizations.
In 1975, after twelve years of continuous support, ARPA dropped ARC. The staff quickly shrank from a high of thirty-five to a dozen, then down to a few, and finally down to Doug Engelbart and a large amount of software. A decade of useful work is an unheard-of length of time in the hyperaccelerated world of software technology, but bootstrapping had kept NLS continually evolving as it expanded its usefulness, as it moved up to machines with larger memories and faster processors, and as the community thought of new things to do with it.
Even before ARPA drastically reduced its funding, ARC had started a subscription service to several corporations who wanted to experiment with using the services of the augmentation system. The way Engelbart saw it, it was time to bring the system out of the research world, after its extended gestation, to test it on a community of real-world users. The way SRI saw it was that the whole project was obviously finished as a magnet for research funds, and they might as well sell it. In 1977, SRI sold the entire augmentation system to Tymshare Corporation, and Engelbart went with it. The system, renamed “Augment,” is now marketed by Tymshare as one of their office automation services.
Nobody disputes that Engelbart’s vision was the single factor that stayed stable through twenty of the most turbulent years of computer science, and those few colleagues who know of his importance to the evolution of computing are loath to speak unkindly of him, yet the tacit consensus is that Doug Engelbart the visionary allowed himself to remain fascinated by an obsolescent vision. NLS was powerful but very complex, and the notion of a kind of knowledge elite who learned complex and difficult languages to operate information vehicles is not as fashionable in the world of less sophisticated but more egalitarian personal computers created by Engelbart’s students.
The twelve years of ARC’s heyday at SRI, from 1963 to 1975, were technologically wild years. That period was one of enormous historical, social, and cultural upheavals, as well. Mistakes, conflicts, blind alleys, and other pitfalls were unavoidable during the course of a project that began in the Kennedy administration and continued throughout the years of the Vietnam war, campus revolts, assassinations, the emergence of the counterculture, the advent of women’s liberation, Watergate, and ended during the Carter administration.
As individuals, and as a group, ARC wasn’t immune to the conflicts that affected the rest of the culture, although it was privy to its own mutated forms of them. Before the counterculture made its media splash and thousands of affluent American offspring started acting weird and growing their hair long, places where powerful computers were to be found had already spawned their own brand of weirdo – the hacker. The advent of this new subculture within the computer subculture was not the direct cause of ARC’s downfall, but it was symptomatic of the problems Engelbart faced in the 1970s.
Engelbart found himself caught between the conservatism of his employers and the radicalism of his best students. ARC had seemed a bit strange to the old-line data-processing types at SRI, and these new people hanging out at Doug’s lab added cultural as well as technological differences to an already strained relationship. To say that SRI is conservative is an understatement. Although some of the subjects their researchers pursue can be unorthodox, their clients are such straitlaced institutions as the Defense Department, the intelligence community, and the top one hundred corporations.
Hackers were barely tolerated in the long, clean, high-security halls of SRI. But when the counterculture started to infiltrate, and the rumors started about some of the hackers augmenting their consciousness in more ways than one, SRI brass became extremely uncomfortable.
There was trouble from within, as well as from above. Some of the experiments in “new-age” social organization, encouraged by Engelbart himself, threatened to split the ARC group into two camps – those who were still techies at heart, concerned only with advancing the state of the computing art, and those who saw augmentation as an integral part of the wider countercultural revolution that was going on around them. And there were those who felt that even Doug’s technological ideas, although they might once have been radical and futuristic, were becoming outmoded. The idea of augmentation teams and high-level time-shared systems began to seem a bit old-hat to the younger folks who were exploring the possibility of personal computers.
In the early 1970s, some of Engelbart’s first and most important recruits, who had helped him create the first NLS system, left SRI for PARC, the new research center Xerox was putting together. The new Xerox facility was a hotbed of augmentation-oriented thought, but with a major difference – the advent of large-scale integrated circuitry made it possible to dream of, and even design, high-powered computers that could fit on an individual’s desk. This emphasis on one person, one computer made for important philosophical and technical differences with Engelbart’s approach.
For a while, Engelbart at SRI and his former students at Xerox were engaged in collaboration, but eventually PARC and ARC drifted apart. Doug still dreamed of creating augmentation centers in universities and industries, providing a service for any team of people who worked with information. The former ARC members were looking forward to an even wider potential computer-using population. The idea at Xerox was to use the new integrated circuit technology to create computers more powerful than the previous generations of minicomputers – and to devote an entire computer to each person, instead of sharing it among thirty or forty users.
PARC, as we shall see, went on to become the new mecca for those who saw the computer as a tool for augmenting the human intellect. ARC never seemed to make it to the promised land, and the former point-man for radical technology seemed to be more and more isolated in an interesting but less than influential backwater. As more and more of Engelbart’s earlier dreams became realities in other institutions, this judgment seemed to be less than fair. It is impossible to tell if there would have been a PARC if there hadn’t been an ARC, and while the miniaturization revolution made personal computers inevitable in a technical sense, there is good reason to question whether the kind of personal computing that exists today would ever have been developed if it had not been for the pathfinding work accomplished by Engelbart and his colleagues.
Doug Engelbart and the people who helped him build ARC did not succeed in building a knowledge workers’ utopia. Some hackers do seem to be pathologically attached to computers. These facts might have very little to do with the way other people will use the descendants of the tools they created. In fact, if you think about it, some of the wildest and woolliest of the MAC and ARC hackers were following in a long tradition of people who weren’t exactly run-of-the-mill citizens – from Babbage and Lovelace to Turing and von Neumann.
It must be remembered that MAC and ARC were only part of a larger effort to raise computing to a whole new level, and hackers weren’t the only scientist-artisans involved in that effort. Whatever future historians decide about the personalities of the people involved in carrying out this unprecedented exercise in planned breakthrough, they will have to consider the role of the hackers who created time-sharing, computer networks, and personal computers in the 1960s and early 1970s, not out of sick obsession or in-group frivolity, but out of a serious desire to construct a new medium for human communication.
For the time being, Doug Engelbart still works away at his original goals, adapting the core of NLS to the new kind of computers that have come into use in the 1980s. To Tymshare Corporation’s customers, the Augment system seems less science-fiction-like and more practical in this age of office automation. People in the business world are beginning to pay attention to what Doug is saying, for the first time since he started saying it, decades ago.
Still, Doug is neither rich nor famous nor powerful – not that these were ever his goals. All he seems to hunger for is all he ever hungered for – a world that is prepared for the kind of help he wants to give. Ironically, his office at Tymshare in Cupertino, California, is merely blocks away from the headquarters of Apple Computer, where icons and mice and windows and bit-mapped screens and other Engelbart-originated ideas are now part of a billion-dollar enterprise.