The Future According to Atari (and ARPA)

A mother and her children looking into a tidepool in Laguna ask the Intelligent Encyclopedia about the plants and animals that they see. [Notice the antenna for cellular communication.]

In April of last year, Bob Stein of the Institute for the Future of the Book posted an interesting blog entry about the work he and Alan Kay did at the Atari Research Group in 1982. The most interesting thing about that post wasn’t so much the actual content as the response in the comments section from Alan Kay himself:

Hi Bob (and visitors)

People reading this should realize that there was nothing new to be thought up to make these scenarios (they were for Warner execs who were not sophisticated about computers despite having bought Atari). The ideas were all drawn (pretty much without exception) from the visions and demonstrations of the ARPA-IPTO research community in the 1960s, ca 1968. Main sources were Licklider, Taylor, Doug Engelbart, Nicholas Negroponte, Ivan Sutherland, Seymour Papert, some of my ideas back then (such as the wireless tablet computer), and many others from our colleagues.

I’m not sure what Bob means by “missing that it was going to connect people to other people”. This was one of the main reasons for all the work we did at ARPA and at Xerox PARC on networking, including ARPAnet and Internet, and both wired and wireless. This was the subject of a very good white paper from 1968 by the founder of IPTO and one of his successors (JCR Licklider and Bob Taylor) “The computer as a Communication Device” (the second paper in this pdf: http://memex.org/licklider.pdf). It was one of the concerns of Bob Barton, the great computer designer when I was in grad school in the 60s. It was the subject of a whole day discussion at the first ARPA grad student conference in 1968 as we were in the process of building the ARPAnet. It was the main concern of Engelbart, who showed in “the mother of all demos” in 1968 many ways to do communication in many modes of time including real-time and face to face. And many kinds of such communication was happening in the ARPA and PARC communities in the 70s.

The problem was the 1980s and what was lost by the gold-rush to commercialize subsets of these ideas (to an extent that in many cases was like the carpetbaggers). It was certainly missed in the 80s by IBM and even Apple. But not at Atari — this was just an implicit part of the “ARPA Dream”.

McLuhan warned us (using books and TV) indirectly about how strongly people were going to try to regain some sense of identity via an electronic global village. The hope by all of us from the 60s was that education — and The Encyclopedia Britannica whom Bob and I tried to get to understand at length what was going to happen — would help to create a sense of the real value here. But this didn’t happen, and we wound up with a pop culture.

We can see this so easily by looking at comments on these pictures on various blog sites. I could only find one person who was unlazy enough to see if there were other opinions about these ideas (and did they really come from 1982 or earlier). That’s a pop culture, mostly trying to admire itself in every shiny surface it can find, and leaving behind the equivalent of “I was here” graffiti.

Best wishes, Alan

In a bar, the two men at the right are watching football on the screen and running what-if simulations on the countertop Intelligent Encyclopedia that second-guess the quarterback. The couple on the left is taking an on-the-spot course in wine connoisseurship.

Some background information: sometime between 1973 and 1975 (dates vary across sources in print and on the Web, usually given as either 1973 or 1975; most likely the firm was contracted in 1972 and purchased outright by 1975), Atari hired a private consulting firm called Cyan Engineering to figure out how to create a home video game system. At the time, Atari was strictly a developer and manufacturer of arcade games. By 1975 they had released a number of such games, with Pong being their best known. A home version of Pong had been released during Christmas of ’75 exclusively through Sears, and by that time Nolan Bushnell was looking to create a home console capable of playing Atari’s arcade games.

Up to this point, Atari had not been much of an innovator. Pong was an almost exact copy of the Magnavox Odyssey table tennis game, except that it was an arcade game rather than a home console game. The Odyssey, released in 1972, was not only the world’s first home console but also the first game system of any kind capable of playing more than one game, which it did by making use of individual game cartridges. However, it was a purely analog device with no microprocessor, relying on discrete electronic components to output the video signal and allow the user to manipulate it. Using overlays placed on the television screen, it could create the illusion that players were playing video versions of games like roulette, hockey, and skiing. The Odyssey was extremely primitive and so badly marketed that it was discontinued in 1975, the same year the home version of Pong came out. While Pong was an improved version of video table tennis, it too was an extremely primitive analog device. From a technical standpoint, none of these games were impressive, let alone state-of-the-art.

By this time, Cyan Engineering had been absorbed into Atari and renamed the Atari Grass Valley Research Center. Bushnell tasked the group with figuring out how to build a home console capable of playing Atari’s arcade games. One of those games, Tank, while featuring primitive graphics, utilised an advanced IC-based ROM. The Grass Valley engineering team came up with a prototype system built around the MOS Technology 6507 microprocessor, a much lower-cost version of the 6502, which made a viable, advanced home console possible. This prototype was eventually named the Video Computer System and released in 1977.

[As a quick aside: while the Wikipedia articles on these subjects are for the most part more or less correct, they do contain a number of glaring errors and omissions. As with any article from a generalised encyclopedia or resource, it is recommended that more reliable, subject-specific sources be consulted for the exact facts.]

The success of the Atari Grass Valley Research Center team in developing the Atari VCS prototype led Bushnell to sell his company to Warner Communications (today’s Time Warner) for $28 million, which gave Atari enough capital not only to mass-produce the VCS for the consumer market, but also to inject even more capital into research efforts. The Grass Valley team had suggested that the VCS had a lifespan of at most about three years, correctly speculating that within that timeframe computer technology would advance rapidly enough to make the VCS look like a primitive device by 1980. As a result, research into a next-generation console began long before the VCS was even released in 1977; a complete design for the successor system had already been prototyped that year, and hardware development continued in earnest through 1978.

However, Warner had other plans. In 1977, Apple, Tandy, and Commodore had released their flagship consumer systems: the Apple II, TRS-80, and Commodore PET, respectively. Warner executives wanted to enter the home computer market and delay the release of a successor to the VCS, in effect extending the console’s life cycle well beyond its planned obsolescence. Bushnell and the Grass Valley engineers wanted the VCS retired by 1979 and the newer, more advanced system released by 1980. This and other disputes with Warner management led to Bushnell being forced out in 1978. Ray Kassar, whom Warner installed as Atari’s new CEO, wanted to position Atari as a major competitor to Apple and directed the Grass Valley team to adapt the new technology they had been developing into a home computer system rather than a video game console.

The result was the Atari 8-bit family of computers, which included the Atari 400 and 800 systems. Both were released in 1979 and were far more advanced than equivalent 8-bit systems from Apple, Commodore, and Tandy. However, Atari’s systems were associated with video games, and it didn’t help that very few third-party developers existed, let alone anyone developing productivity or educational applications. While Tandy dominated the ultra-low-cost market, Apple and Commodore dominated the productivity and education markets. Atari was naturally most popular among gamers because of its vastly superior graphics and sound capabilities, and it also became the system of choice for many hackers and programming enthusiasts because of its versatile nature. But the gaming market wasn’t enough to give Atari any sort of lead in market share over its competitors, and this, coupled with the high price tag its computers carried due to their exceptionally high-quality construction and assembly, led Warner executives to release the successor to the VCS (since renamed the Atari 2600) in 1982: the Atari 5200. Since the original prototype of what became the 5200 had already served as the basis of the Atari 400/800 computers, the 5200 was effectively a console version of those machines, lacking a keyboard, RAM slots, I/O ports, and other standard features of home computer systems.
By this point, Warner’s strategy for the home computer market had shifted toward a more long-term approach. The company refocused its efforts on maintaining dominance in the video game console market, where it had begun losing significant market share to Mattel and Coleco, in order to stabilise the company economically and ensure a steady, growing level of profitability until Atari was in a position to take the lead over Apple and Commodore. Warner executives had made a number of significant blunders since taking over Atari, blunders that contributed directly to the video game crash of 1983 (out of whose ashes Nintendo and Sega would begin their ascendancy over the console market in 1985 and 1986, respectively) and that ultimately led to the decision to sell Atari’s home console and computer divisions to Jack Tramiel, the founder of Commodore, in 1984, keeping only the arcade game division under the Warner umbrella.

One mistake they did not make was failing to recognise the significant position Apple held in the late seventies and early eighties, and the potential supremacy that company could achieve not only over the computer market, but over general consumer electronics as well. Warner’s management had closely observed that Steve Jobs and his elite team of engineers had visited Xerox PARC and studied the amazing things PARC researchers had been working on, things rumoured about endlessly around Silicon Valley. As a growing multimedia conglomerate, Warner had set out an ambitious program to enter consumer electronics and prevent Sony, a Japanese conglomerate, from further expansion in the United States. The seventies and eighties were a period of intense Japanophobia amongst American companies, who feared that Japan’s skyrocketing economic growth would inevitably allow it to dominate America and displace it as the world’s leading economic power. Sony in particular was growing fast, and it clearly had designs on acquiring American media companies as part of its global expansion strategy. Atari was the first step in Warner’s plan to enter consumer electronics, and it is no coincidence that the word “atari” itself is Japanese, meaning “to hit the target” or “to receive something through great fortune.”

Having set their sights upon Apple in the late seventies, and by 1980 sensing that there was more going on there than just home computer development, Warner significantly expanded Atari’s research arm in 1982 with the establishment of the Sunnyvale Research Lab, led by Alan Kay, who had come from Xerox PARC to serve as Atari’s Chief Scientist, along with additional labs in Los Angeles and Cambridge, Massachusetts. Each of these labs, including the original Grass Valley center, was tasked with a specific mission, all revolving around next-generation technologies that would lead the market seven to ten years down the line. The thinking at Warner was that Atari would be in a much better position to compete with Apple and Sony (the IBM PC platform and Microsoft were not exactly on their radar) in the near future rather than at the present moment. After all, Atari had started as a video game company, and Warner management had zero experience in the technology sector. While contemporary video game historians commonly assume that Warner executives were a bit impetuous and arrogant in their assumptions about video games and technology, the reality is that they were aware enough of their shortcomings to delegate the task of figuring out what sort of company Atari should become to the engineers and researchers at Atari Research.

The most important of these labs was the one in Sunnyvale, led by Kay. Kay had been a key member of the PARC research staff, having worked on such revolutionary technologies as object-oriented programming, networking, and the graphical user interface. The last of these, the graphical user interface, or GUI, would form the foundation upon which the Lisa and Macintosh computers were developed at Apple. Warner was interested in all of these ideas, but it wanted Kay and his team to work with these concepts as a unity, as had been done at ARPA and PARC in the early seventies, and even earlier, going back to Vannevar Bush and his pioneering concept of the “memex” in 1945. The memex was the hypothetical precursor of hypertext, the World Wide Web, Google, and Wikipedia.
The memex went even further than all of these, however, in its concept of an all-encompassing, all-consuming encyclopedia of knowledge that would unify all possible information, including all human communications, books, music, and everything else that could be recorded and digitally stored, under a single, searchable system. It was, in effect, to be the total sum of collective human knowledge, memory, and experience in a unified and open system. Conceptually, the memex had an immense influence on government technologists and researchers at ARPA, who began developing the ARPANET in 1969. The ARPANET was the direct precursor of the Internet, and it is obvious that having all human knowledge, communications, and art stored in a universal database would be of immense strategic and economic value to governments and corporations. The researchers themselves were more interested in the social value of such a vast network and its potential for universal human education and communication across physical boundaries.

It was this ideal that drove Alan Kay and his team at Atari to develop the idea of an “Intelligent Encyclopedia,” and it was this that interested Warner most. The potential appeal of a universal device that could connect wirelessly to a central network, granting ordinary people access to a potentially limitless resource of information and allowing them to communicate with each other digitally and to store and share every kind of information, would be nothing short of explosive. But it was obviously many years down the line. Had Atari not imploded in 1983 due to internal mismanagement on Warner’s part, one could speculate that it would have become the consumer electronics powerhouse that Apple became after the introduction of the iPod. After all, Apple was built upon visions and ideals developed by researchers and engineers at ARPA, funded by the United States government, and more specifically the Department of Defense, and further developed at PARC. While the first incarnation of the Macintosh was released in 1984, it wasn’t until the release of OS X in 2001 that Apple finally achieved what Steve Jobs had long dreamed of: making what he had seen at PARC a viable commercial reality. We rarely, if ever, hear about Vannevar Bush, the government scientist and engineer who was one of the principal founders of the defence contractor Raytheon, a key figure of the Manhattan Project, and one of the driving forces behind the founding of the National Science Foundation, or about how it all began with him. More than a “Father of the Internet,” Bush was a “Father of the Future.”

Alan Kay rightly states what should be obvious to most, but is apparent to only a few: that pop culture is “mostly trying to admire itself in every shiny surface it can find, and leaving behind the equivalent of ‘I was here’ graffiti.” Pop culture suffers from a distinct lack of critical thinking; it is intensely self-referential and lacking in any sense of history. It is pure vapidity, existing only for the moment and for nothing other than its own amusement. Pictures of the original illustrations by Glenn Keane, commissioned by Atari Research, can be viewed on our Pinterest board: http://pinterest.com/lexander/atari-research/.
