Article · Open access · Peer-reviewed

The Re-Wiring of History

2009; Queensland University of Technology; Volume: 12; Issue: 3; Language: English

DOI

10.5204/mcj.148

ISSN

1441-2616

Author

Greg Shapley

Topic(s)

History of Computing Technologies

Abstract

Introduction

The history of the human being is the history of technology—of the production of interfaces to interact and communicate with each other and the rest of the physical world. The concept of the cyborgian-us as described by Donna Haraway in the eighties is now so entrenched in cultural studies, and in art and media theory, that it has become a cliché (475-496). From the use of the first tool, the wearing of the first shred of clothing, we and our technologies became inseparable—indistinguishable. There is no point at which we end and our machines begin. We eat, sleep, and move with technology, and our thoughts are intertwined with its daily machinations. As Hardt and Negri put it more recently: “Tools have always functioned as human prostheses, integrated into our bodies through our laboring practices as a kind of anthropological mutation both in individual terms and in terms of collective social life” (217). If there was any doubt as to our symbiotic relationship, the Internet and advances in genetics have annihilated it. “The moment we invent a significant new device for communication—talking drums, papyrus scrolls, printed books, crystal sets, computers, pagers—we partially reconstruct the self and its world, creating new opportunities (and new traps) for thought, perception, and social experience”, says Erik Davis in his book TechGnosis (4).

Technology, however, is also the ultimate victim of the cult of the new. Old technology is meant to disappear—made redundant and invisible by whatever is new and shiny. There can be nothing that we can learn from it that hasn't already been incorporated into the latest thing. Thus when a new device emerges that has the same or similar function to an existing device, the history of the older piece of equipment is reworked, or rewired, into the new. Like Russian dolls or onion skins, one subsumes the other. There can be nothing differentiating these devices, except that one has been consigned to the scrap heap. (Two exceptions exist: the first is the fetishisation of technology, where certain traits of older technologies are given celebrity status, such as the modern phenomenon of using black and white in movies, or the resurgence of analogue, or analogue-sounding, instruments; the second is the sentimentalising of technology, where it is given a pat on the head, placed on a shelf and swooned over as a reminder of better (or at least more interesting) times gone by—thus accounting for the otherwise unfathomable popularity of television shows like Collectors and Antiques Roadshow.)

The history of technology then becomes very undialectical. If old technology is written out of existence because the current model has subsumed its identity and character, then the importance of the older equipment to human development is lost. This teleological history ignores the material conditions and conflicts surrounding older technologies, as well as any unique qualities that have been shelved along the way due to convenience, or commercial or other concerns. Media theorist Brian Winston has coined the term 'supervening social necessities' for these external forces that act upon technology (Murphie and Potts 20). There are some obvious examples of these forces that I will return to later, like the thousands of home computers that became available during the seventies and eighties that weren't IBM compatible PCs or Apple Macs.
Then there are less obvious inventions, such as the pre-digital facsimile machine.

The Analog Fax Facts

It is a little known fact that the fax machine did not emerge from the digital revolution of the 1970s. In fact, we have to go back well over a century, to the 1840s, to discover its origins. According to Jennifer Light's excellent descriptive (but not technical) history of the facsimile, Alexander Bain, a Scottish electrician, was the first to create what he called an “automatic electrochemical recording telegraph” (357). Keeping in mind that Alexander Bell was still about thirty years away from uttering the famous sentence “Mr Watson — Come here — I want to see you”, a slew of other similar inventions followed, some using telegraph lines while others employed wireless technologies (Bruce 181). I don't wish to simply reiterate Light's revealing paper, but it is nevertheless worth attempting to summarise some of this hidden history. Few people, for instance, would realise that the first attempt to commercialise the fax happened in the 1920s, with both newspapers and radio vying for the rights to distribute news through this medium (358). By 1948 the US government had legislated to legalise 'multiplexing', whereby both an audio signal and an illustrative fax could be broadcast on a single channel. In other words, radio stations began to send printed pictures with their radio shows (364). Light, after Payne, suggests some of the uses: “As listeners enjoyed a foreign opera, translations of the music, or the score, could be sent to their living rooms; coupons might be faxcast as product promotions played” (364). Also in 1948, RCA, Eastman Kodak and NBC released the 'Ultrafax'. “This machine—a wireless Internet of sorts for mid-century—combined radio relays with high speed film processing such that it could send not only documents and sound, but also film to remote locations... To showcase the innovation's ability 'to transmit at the speed of light—186,000 miles a second'—all 1047 pages of Gone with the Wind were delivered as a clock timed the process at two minutes, 20 seconds” (Light 365).

The Digital Fax Facts

It is clear from Light's paper that the current image of the fax machine as a beige, paper-jammed nuisance has been imposed in the wake of the digital fax machine and superseding Internet technologies. From this perspective, Jonathan Coopersmith's article, “Facsimile's false starts”, written in February 1993 at the height of digital fax popularity, and a mere three months before the World Wide Web came online en masse, makes fascinating reading (Ó Dochartaigh 14). Coopersmith places the digital fax machine at the forefront of technological advancement. In a statement whose tenor was to echo globally less than a decade later about the Internet, he trumpets his subject: “Imagine life without it. From production to politics, from delis to Delhi, fax machines have everywhere transformed how people communicate and have shrunk the world by simplifying and accelerating the flow of information” (46). In championing the digital fax (for he does not hide his enthusiasm) Coopersmith paints a very different picture from Light's quirky history of the analog fax. The moment of writing is the moment of perfection, to be savoured and sentimentalised: “By the end of 1991 ...
6 million fax machines hummed and whirred within the 50 states” (compare this with the language of the early twentieth century Futurists: “...gluttonous railway stations devouring smoking serpents; factories suspended from the clouds by the thread of their smoke” [Marinetti] and of late twentieth century post-digital artists such as Kim Cascone, who famously wrote about “computer fans whirring, laser printers churning out documents, the sonification of user-interfaces, and the muffled noise of hard drives” [393]. This is the language of technology obsessives!).

In an obvious case of rewiring history, Coopersmith has buried Light's history. The digital fax is clearly 'superior technology' when compared to its analog predecessors. There is little mention made of its alternative uses (such as an illustrative device for radio), and the 'Ultrafax' that could send the entire text of Gone with the Wind in less than three minutes is nowhere to be seen. There are some references made to 'niche markets' such as newspaper broadcasting and weather map distribution, but these are reported as mere dead-end folly (47). Much of the article is focussed on the 'serious' analog machines that were produced from the late 60s onwards and their teleological evolution into the digital fax machine in the 1980s. According to Coopersmith, these older machines had two major flaws: “They were slow by modern standards” (an analog fax machine made in the mid-seventies could send one page in 2-3 minutes, whereas its digital descendant could transmit more than a page a minute by 1980) and they had incompatible 'communications protocols' (48).

Coopersmith has made a common mistake in recounting history: he has started in the present and worked backwards—he has begun with a set of closely defined parameters (such as fax speed and protocol compatibility) and searched for those traits in previous technology. He has 'rewired' the history of a piece of technology to match its current day usage. Everything else is either trivial or irrelevant. Light, on the other hand, may be accused of doing the reverse, of pulling out the unusual, the novel, and sensationalising what is really just a fairly ordinary piece of office equipment, with (on the whole) a fairly ordinary evolution. The difference is that, at this point in time, most of us are aware of what a fax machine does and can hazard a guess at its immediate past (it is not rocket science to deduce that there were once slower, analog machines). Light draws our attention to technologies that have been discarded along the way that may (or may not) have led to interesting developments (it was suggested, for instance, that the Ultrafax could transmit movies to cinemas across the country [366]) if convenience, commerce or some other historical anomaly hadn't written them out of existence.

There is another sign that Coopersmith is not being totally honest with himself or with us. Keeping in mind that he is writing this article only three months before the Internet becomes publicly accessible, time and again the author derides this impending juggernaut in defence of his beloved fax: “Predictions in the 1970s and 1980s about the impending automated office dismissed the [fax] because it did not fit the image of an electronic, paperless future” (48); “When was the last time you pulled into a gas station to send an e-mail message?” he asks rhetorically (48) (prophetic, considering the places e-mail booths can be found now); “...
anyone who can photocopy and make a telephone call can use a fax machine. Compare this ease with the investment, personal as well as financial, in learning how to use electronic mail...” (46); and “Universal communication now exists, a goal still sought in e-mail” (48).

Coopersmith was writing in an era when time could be said to be speeding up, through an increase in the annihilation of space by time (the beginning of the Internet revolution). David Harvey (after Gurvitch) cites a type of time that is cutthroat, competitive and speculative, where the future becomes the present (224-5). Coopersmith has fallen victim to this phenomenon, time in advance of itself, finding himself longing for something that has not yet ceased to exist; becoming sentimental in defence of a present he fears will soon pass, an affliction that I have previously termed hyper-sentimentalisation. This phenomenon is often accompanied by a rosy picture of your precious present projected into the future. While Coopersmith is careful not to get too carried away (all the talk of the Internet must have made him a little wary), he does think that the fax machine “has become an integral part of today's office [and] will be around tomorrow” (48), and predicts that faster, colour machines will be available into the next century (perhaps technically a statement of fact, but it is also true that the fax, excepting legacy uses, has now been relegated back to niche markets) (49).

I chose these two articles not because I find the fax machine necessarily the most interesting piece of equipment, but because they demonstrate quite precisely some of the endemic problems of writing about the history of technology (indeed the mainstream perception of technology throughout history). This discursive analysis also hints at the larger problems associated with analog and digital histories (and the breakup of technologies into this dichotomy).

The Analog/Digital Dialectic

In the same way that the history of analog fax machines has been rewired to suit a digital present, so has the entire history of the analog been subsumed (on the whole) by the digital (with the same two exceptions as above—the fetishised and the sentimentalised). The origin of the digital computer may be traced back hundreds, if not thousands, of years: from Charles Babbage's 19th century Difference Engine, back through Pascal's 17th century adding machine, to the ancient Chinese abacus (Aikat 56). Digital technology has been around since humankind developed the ability to externalise mathematics (analog technology is, by default, older, as the digital requires the manipulation of abstract concepts). Our story, however, starts in the mid 1940s, with the clash of two paradigms. The debate about the power of analog versus digital computation came to a head in 1943 at the Josiah Macy Jr. scientific conference, where Gerard and Bateson defended the old and McCulloch, Pitts and von Neumann paraded (what was seen to be) the new (Siegelmann x). A decade before, Alan Turing's conceptual machine had provided the theoretical framework for this debate. Turing's imaginary device had all the essential components of today's computers—a tape and reader (input-output devices), a control mechanism (central processing unit) and the control mechanism's storage (memory) (Aikat 64). In this model the tape is theoretically infinite, meaning that any algorithmic calculation may be performed to as many decimal places as required (at least in theory) (Bains).
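The model is simple enough to sketch in a few lines of present-day code. The toy simulator below is my own illustration and appears in none of the sources cited here; the rule format, the function name and the bit-inverting example program are assumptions made purely for demonstration. It shows the three components just described (tape, read/write head, and a finite control table) working together.

```python
# A toy simulation of Turing's conceptual machine: a (notionally unbounded)
# tape, a read/write head, and a finite table of rules. Everything here is
# illustrative -- the rule format and the bit-inverting program are assumptions
# made for this sketch, not anything taken from the article or its sources.

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Run a one-tape machine. `rules` maps (state, symbol) to
    (symbol_to_write, head_move, next_state); head_move is -1, 0 or +1."""
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):         # the idealised model has no step limit
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Control table: invert every binary digit, halt on the first blank cell.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run_turing_machine("10110", invert))   # prints 01001_ (trailing blank)
```

Swapping in a different control table changes what the machine computes, which is the sense in which one machine can, in principle, perform any algorithmic process.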
Another appeal of the conceptual Turing machine is that it was considered to be 'universal', meaning that one machine could be programmed to perform any process imaginable (Aikat 65). The power and simplicity of this model made it too enticing to ignore, and following the 1943 conference the digital approach dominated (Siegelmann x).

Two questions spring to mind from this history as I have recounted it. Firstly, could there have been anything worth defending within the analog, or were Gerard and Bateson merely being conservative or sentimental? Secondly, although this was a definite turning point away from the analog, to what extent is our digital present exaggerating this moment? Is it possible to extract other histories, other than digital ones (or ones that don't fit into this dichotomy at all), from then till now that have been buried within the cult of the new (or the now)?

Does the Analog Have Anything Left to Offer?

A recent book by Siegelmann makes some valid arguments for the defence of analog technology (Siegelmann). As a work of abstract mathematics, Siegelmann's thesis is, at times, impenetrable to the mere mortal, but the inherent problems in digital technology, and the subsequent benefits of analog technology, aren't difficult to understand. The benefits of digital technology are well understood and have been exploited exponentially over the last 30 years. The particular digital model proposed by Turing, refined in subsequent decades, and then commercialised by such companies as IBM and Intel is, almost without exception, the only model left standing. In the 40s analog computing was as viable as digital (perhaps more so), but by the 50s digital technology had become the immediate future, at least for data processing (it was still inconceivable that photography, sound recording and all telecommunication would one day be performed on the same machines that could process census data). By the 60s and 70s the sky was considered the limit, and by the 80s the sky had been (supposedly) left for dead. Each of these decades saw amazing new developments in digital technology, but each also had to sacrifice technology that did not fit the model, for no other reason than that it did not fit the model.

There are of course often valid reasons for sending obsolete pieces of technology out to pasture. But quite often it is mere chance, luck, or commercial expediency, often done in the name of compatibility (if everyone is using the same piece of technology, or a technology that has closed protocols—protocols shared by all similar pieces of technology—then there is a greater shared market; this type of occurrence should probably forever be known as the 'VHS over Beta' phenomenon). The story of how Bill Gates and the fledgling Microsoft gained a near global monopoly on computer operating systems is one tale of recent cyber-lore that had little to do with the substance or quality of the product and everything to do with opportunity and cunning. Gates was originally hired by IBM to provide a BASIC language program for its new PCs, but managed to talk them into purchasing an entire operating system, which Microsoft did not yet have. Gates managed to purchase a hastily created clone of another operating system and license it to IBM without either party finding out, and the rest is history. Many computer professionals are bitter to this day about how what they consider to be an inferior product conquered the market (Orlowski). Even our immediate digital history is littered with debris.
Most recently, Apple shelved the use of Motorola integrated circuits in favour of the now all-pervasive Intel chipset (Grant & Meadows 163). And according to a recent article in the Sydney Morning Herald, “the latest [Intel] Macs are built using the exact same components as a Windows PC” (Flynn 4). There are now no mainstream computers in production that don't use an Intel (or Intel-like) central processing unit. The early eighties saw the proliferation of small, comparatively cheap personal computers: from the homey Vic 20 to the Apple Mac, from the minute D.I.Y. ZX-80 to the 'serious' IBM XT and AT models. There must have been a thousand different machines, all with different specifications, vying for a wide range of markets, from the high school hobbyist with a bit of pocket money to spend to major corporations with million dollar budgets (see Allan & Laing). By the mid 90s there were two outfits left standing: Intel driven machines (also known as IBM compatibles, or just PCs) and Apple Macs (and, for a time, Apple Mac compatibles). This was despite there existing machines that were considered technically superior (such as the first multimedia and multitasking computer, the Amiga, which many early digital video and image editors swore by; the company that owned the Amiga at the time, Commodore, went broke in 1993, all but burying it with their demise [Laing 166-169]). Even in these most recent of times there is an enormous amount of obfuscated technological history. Most computer users are unaware of the Amiga or the myriad of other computer types that now comprise landfill (note that I haven't even considered operating systems, software or peripherals in my account).

Analog technology is considered to be everything that came before the 'digital revolution' (this categorisation, itself, is problematic, but more about that later). It is today widely believed that analog has been superseded, but this is not quite true. Although (as stated before) digital technology is theoretically capable of determining any equation, practically it is not. Non-linear equations are usually held up as the shining example of this, but for those of us who have forgotten our senior high mathematics there are a couple of very digestible examples. We all know that pi is equal to 3.142 (etc. etc. etc.). Most digital computers could do a much better job of getting closer to pi than you or I, but none is capable of calculating it exactly. Why does this matter? It means that a digital computer is incapable of making precise use of pi, meaning that it can never theorise a precise circle. Again, why should this matter? In most applications it shouldn't. If you were trying to use pi to determine the precise orbit of Pluto, or even long-term trajectories of ocean currents, you may run into trouble. An even simpler example is the calculation of one third. One divided by three on a calculator or computer will give us an answer of .3333333 (etc. etc. etc., the threes going on infinitely, at least theoretically). Again, the division of a precise third becomes impossible for digital computers, and again, it can be said that ‘extremely small errors in initial conditions can grow to influence behavior on a large scale’ (Bains); a brief numerical sketch of both points follows this paragraph. These sorts of simple equations, along with non-linear equations and 'real-life' situations that are more chaotic than most modern-day computers can handle, have, in the past, been simulated by purpose-built analog computers that employ materials and systems that are analogous to real-life situations.
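To make those two claims concrete, here is a minimal sketch in Python; it is my own illustration, not code from Bains or Siegelmann. It shows, first, that a binary digital machine stores only finite approximations of one third and of pi, and second, that in a chaotic system (the standard logistic map is used here as a stand-in for the 'real-life' situations discussed above) an initial difference of one part in a trillion grows until the two trajectories no longer agree.

```python
# A minimal numerical sketch of the two claims above (my own illustration,
# not code from Bains or Siegelmann): a binary machine stores only finite
# approximations of 1/3 and pi, and in a chaotic system such tiny errors
# are amplified until two nearly identical starting points disagree entirely.
import math

print(f"{1 / 3:.20f}")    # 0.33333333333333331483 on an IEEE-754 machine
print(f"{math.pi:.20f}")  # 3.14159265358979311600 -- a finite stand-in for pi

# Logistic map x -> r*x*(1-x) with r = 4: a standard chaotic system, used
# here as a stand-in for the 'real-life' situations discussed in the text.
r = 4.0
x, y = 0.2, 0.2 + 1e-12   # two initial conditions, one part in a trillion apart
for _ in range(60):
    x, y = r * x * (1 - x), r * y * (1 - y)
print(abs(x - y))         # typically of order 0.1 to 1: the trajectories have diverged
```

More bits only push the rounding further down rather than removing it, which is precisely the limitation that the purpose-built analog simulators described next were intended to sidestep.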
A simple example of one of these analogous situations may be how adding cream to coffee, creating swirling patterns, simulates weather patterns. As Siegelmann explains: ‘chaotic systems … are inherently analog and are sensitive to minute changes in their parameter values. Digital computers can only approximate chaotic systems up to a certain point in time, according to the precision provided for the initial description’ (xi). ‘Real-life’ situations are prone to minute, continuous and infinite changes in variables, which digital computers are simply not made to keep up with. According to Siegelmann, ‘algorithms that are based on real numbers and act in a continuous domain not only provide a speedup but also describe phenomena that cannot be mimicked by digital models’ (xi). In other words, a digital computer is comprised of static memory that can only change in set increments, and can only be changed by a dynamic central processing unit. If many (perhaps an infinite number of) variables are continually changing along an infinite continuum (variables do not naturally increase in set increments), a digital computer will only awkwardly approximate outcomes.

Adding Up the Digital

Now to the second question: whether our digital present has exaggerated defining moments in history (like the 1943 conference), and if so, what else can be made of it. In his article, C. Bissell quotes Paul Edwards in calling the analog computer “one of the great disappearing acts of the twentieth century” (1). According to Bissell, “...the commonly accepted view that analogue computers had a short flowering, and soon fell into decline with the advent of the digital computer, is a highly inaccurate historical reading” (4). He cites a number of contested areas concerning analog computers that kept them viable until at least the seventies (and, in niche situations, into the present), the most striking of these being speed: while digital computers were undoubtedly faster at repeated mathematical calculations, analog computers were often much speedier at practical simulations (5). He quotes the following example from Small (11): “At Project Cyclone digital computers were [...] used to verify the accuracy of results from the analog computer. [...] In the simulation of a guided missile in three dimensions, the average runtime on the analog computer facility was approximately one minute. The check solution by numerical methods [i.e. digital methods...] took from 60 to 130 hours to solve the same problem”. Even today, this analog computer would be competitive, speed-wise, with, and possibly much more efficient than, its digital counterpart. More recently, in the early nineties, the CNN (Cellular Neural Network) Universal Machine and Supercomputer claimed to be the “first algorithmically programmable analog array computer with its own language and operating system and which has the computing power of a supercomputer on a single chip” (Roska 163-4). Although the information available for this 'supercomputer' appears limited to the highly technical (which is a problem when attempting to comprehend a history and write a historiography), I have managed to glean what are considered its merits over the then current digital computers.
By employing principles that “the living systems of nature have provided us” (something that only analog computers are truly capable of without resorting to bitmapped simulations), the CNN Universal Machine and Supercomputer is able to perform many complex neural functions 'without spikes', that is, in a way that is nuanced, not binary, and with potentially infinite possibilities. It can make use of a type of topographical mapping of the visual (termed 'retinotopic') akin to the brain's continuous, 'multiscreen' interpretation of eyesight (this appears to differ from the digital in that it is continuous, not staged and staggered). Existing physical phenomena can be translated and imported into the computer as complex cells (again, presumably, without using simulated bitmaps). And by employing a continuous realm, the containment and manipulation of data is much more efficient and faster (there is a seemingly convincing comparison between this chip and an Intel 80170 that implies this, written in techno-babble, which, for the purposes of this paper, I am willing to accept at face value) (Roska 163-4).

I have spent some time extolling the virtues of some of these analog machines, not because I'm advocating we add to the e-waste crisis by hurling our current machines out the window (for a start, this article would then never see the light of day), but to demonstrate a general quirk of history, and specifically of our recent technological history, illustrated nicely by the previous example: despite there existing an analog supercomputer in the early 90s, digital technology exists as superior technology in our collective, popular consciousness, and history will reflect this perception.

Disputing the Analog/Digital Dichotomy

Finally, in a cruel twist that I have saved to the bitter end, I would like to contest the dichotomous terminology used throughout this paper as being contingent on historical usage. Analog and digital are not terms set in concrete. Their meanings have changed with every new wave of technology. Wikipedia has become the vessel for current accumulated popular knowledge (thus representing a contemporary colloquial and collective understanding of terminology), so I have begun by looking at its account. Curiously, the first thing I notice is that if you search for 'digital computer' you are redirected to 'computer', whereas 'analog computer' has its own entry (that is, the term 'digital computer' appears to have become a tautology because all computers, except for some historical relics, are supposedly digital). There is a vague definition for 'computer' (being “a machine that manipulates data according to a list of instructions”) and then a (digital) history from the mid-forties onwards (Wikipedia, computer). The entry 'digital' provides the definition we would normally expect of 'digital computer': that it “uses discrete (that is, discontinuous) values to represent information for input, processing, transmission, storage, etc.” (Wikipedia, digital computer). 'Analog computer' is defined as a “form of computer that uses continuous physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved” (Wikipedia, analog). These entries, like just about all definitions since the late seventies (since the rise of our modern Intel-driven existences), have used the term 'discrete' to describe the digital and 'continuous', the analog.
Take another (from the charismatically scribed work TechGnosis by Erik Davis): “Analog gadgets reproduce signals in continuous, variable waves of real energy, while digital devices recode information into discrete symbolic chunks” (6). And, just for good measure, a third from the late seventies: “In a digital computer information is represented by discrete elements and the computer progresses through a series of discrete states. In an analogue computer information is represented by continuous quantities and the computer processes information continuously” (Moor 217). While 'discrete' and 'continuous' may go some way to differentiating between digital and analog, they are fairly abstract and do not tell the whole story (unqualified, they can also be misleading, or even contradictory: discrete has a dual meaning, and it could be said that a program on a digital computer can run continuously). Quite often this definition is left unexplained by authors as an undisputed technical explanation, quickly followed by a more descriptive and colloquial one that describes style more than substance. Davis, for instance, goes on to say in post-digital Californian that “The analog world sticks to the grooves of the soul—warm, undulating, worn with the pops and scratches of material history. The digital world boots up the cool matrix of the spirit: luminous, abstract, more code than corporeality” (6-7). The decontextualisation and parroting of terms from the last tectonic dialectical conversation involving this technology has led to an anaesthetising of meaning. Whenever digital is described as 'discrete' and analog as 'continuous', it is taken for granted that these definitions are correct because of their prior usage. They are not interrogated as they should be, but this has not always been the case. The seventies was the decade in which digital technology practically took hold. It is not surprising, then, that substantial debate as to its meaning, and consequently the retrospective understanding of recent technological history (and, yes, its rewiring), was taking place. In an article entitled “Three Myths of Computer Science”, James Moor (after defining digital and analog as per above…
