Part I
Digital Histories
CHAPTER 1
Building a Digital Society
It is notable that many of the structural possibilities of a digital society were foreseen in the early days of electronic computing (and even before). Others were either unanticipated or came more from the pages of fantasy novels than from the rational projection of technologists. Parallel processes were in play. This first chapter casts its eye firmly on the futurism of the past, specifically from the advent of modern computer science in 1936 to the crisis of undue complexity in 2007. In this brief history, I will recall how digital technologies, through a virtuous circle of mediation, came to reinforce and implement particular ways of conceptualizing and ordering society in the late twentieth century. As the application of digital technologies gathered pace from the 1970s onwards, the various imperatives driving technological change were also reshaping the dominant political structures in developed and developing societies. Because of the apparent confluence of technical and social change, it has become commonplace to argue that the impact of digital technologies is comparable to that of the Industrial Revolution (Castells 1996). An 'information revolution' is seen to mark an irreversible transition from physical to intangible (untouchable) commodities and actions, and from embodied to mediated social processes (Leadbeater 2000). More cautious scholars have argued that it is vitally important to recognize a crucial element of continuity with the previous 'industrial' age in terms of a progressive technical automation of human functions and social organization (Webster 2006). In that respect, the digital media have a substantial and fascinating history, and it is worth situating these arguments against the past evolution of digital society.
Information Society and the Atomic Age
The digital technology of today finds its origins in the mechanical calculating machines invented during the nineteenth century. The 'analytical engine' conceived by Charles Babbage in the 1830s was the first machine intended to process and store information with multiple purposes and flexible configurations of usage (Swade 2001). A prototype of this machine was completed by the beginning of the twentieth century. The storage of information records on 'punched' cards was also an innovation of the nineteenth century that would later become a key component of computing in the twentieth century. In 1936, Alan Turing introduced the concept of the 'Turing machine', a device that would make calculations based upon a large store of printed information which could be selectively applied for mathematical processing (Petzold 2008). Turing took this concept further to demonstrate the idea of a 'universal machine' that could read the description of any computational process (an 'algorithm') and then simulate its operation. Turing was one of the most significant figures in modern mathematics and, as a consequence, following the outbreak of the Second World War, he was recruited to work at Britain's secret code-breaking centre at Bletchley Park. Turing famously devised the 'bombe' machine in order to decipher the secret codes produced by the German cryptological machine (the 'enigma') (Copeland 2004).
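Turing's insight can be illustrated in a few lines of modern code: a single, general-purpose routine reads the *description* of a machine (its transition table) and then simulates its operation, step by step, on a tape of symbols. The sketch below is purely illustrative and uses hypothetical names throughout; the `flip_bits` table is an invented example machine, not anything Turing himself specified.

```python
# A minimal sketch of Turing's 'universal machine' idea: one routine that
# reads any machine's description (a transition table) and simulates it.

def run_turing_machine(table, tape, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    table maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left), 0 (stay) or +1 (right); the special
    state 'halt' stops the simulation.
    """
    cells = dict(enumerate(tape))  # sparse tape, indexed by position
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        new_symbol, move, state = table[(state, symbol)]
        cells[pos] = new_symbol
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A hypothetical example machine: walk right, flipping 0 <-> 1,
# and halt at the first blank cell.
flip_bits = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run_turing_machine(flip_bits, "1011"))  # -> 0100
```

The point of the sketch is that `run_turing_machine` itself knows nothing about bit-flipping: the behaviour lives entirely in the data passed to it, which is precisely the separation between universal mechanism and stored description that made the general-purpose computer conceivable.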
The global conflagration that killed between 50 and 70 million people in the mid twentieth century occurred on the cusp of several major scientific breakthroughs, including not only computational machines, but also modern electronics and nuclear physics. In that respect, the war years (1939–45) were as much a scientific and technological contest as they were a military one. The most technologically advanced nations in the world – Britain, the United States and Germany – effectively conscripted their scientific talents and applied them relentlessly to military applications, culminating in the advent of computers, missiles and the atomic bomb in the 1940s. It is in that context that Konrad Zuse developed, in 1941, the first programmable machine operated through information stored in binary code. The United States built the first electronic computer in 1941 and Britain developed an electronic device with limited programmability (the 'colossus') in 1943 (Copeland et al. 2006). In 1942, Britain took the momentous decision to share all of its scientific secrets with the United States, and the collaboration between the two countries enabled them to surpass Germany in the fields of logical computing and atomic weaponry. Needless to say, the atomic bomb, and its use against Japan in 1945, was an epochal moment in human history. The significance of the emergence of modern computer science, however, was kept under tight secrecy and did not become fully apparent until a number of years after the war.
The United States Army built the ENIAC device in 1946 to aid in the successful delivery of missile weapons, whilst Britain built the first programmable electronic computers (the 'Manchester computers') between 1948 and 1950. Accordingly, the pursuit of electronic computing – in primitive but strategically important forms – by the major antagonists during the Second World War in the 1940s is commonly seen as heralding what has been called the 'information age'. The conflict had brought together large numbers of scientists, academics and technicians on an unprecedented scale and had demonstrated how major technical achievements could be made quickly through such systematic collaboration. It was this experience that underpinned the decision to massively expand university and technical education in the post-war decades. In making his assessment of these developments for the future, Vannevar Bush, the director of the Federal Office of Scientific Research and Development in the United States, wrote an essay in 1945 in which he reflected on the growing specialization of knowledge and the new tools for managing information that would become essential in the post-war world. Famously, Bush projected the imminent arrival of a desktop information management machine that he called the 'Memex'. The Memex would facilitate the storage, retrieval and, most critically, the linkage of information customizable to the needs of each user.
A Memex is a device in which an individual stores all his books, records and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory. It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk. In one end is the stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part of the interior of the Memex is devoted to storage, the rest to mechanism. Yet if the user inserted 5,000 pages of material a day it would take him hundreds of years to fill the repository, so he can be profligate and enter material freely.
Vannevar Bush (1945) 'As We May Think', Atlantic Monthly, 176(1): 101–8
The development of 'mainframe' computers in the 1950s and 1960s produced rapid leaps in the application of electronic computing to solving advanced mathematical problems. These machines were far from the desk-based device envisioned by Vannevar Bush, commonly taking up the size of an entire room or more. Mainframes required a massive amount of power and a large team to maintain and operate. Nonetheless, the energies spent upon the development of these machines stemmed from a widespread recognition that the concentration of information in forms that could be processed in any number of ways would open up enormous potentials for scientific development. Computerization would simultaneously solve the problem of memorizing and managing all that information. The speed of electronic processing promised to overcome the time- and scale-based limitations of human thinking. This step-change in efficiency could obviously be applied to scientific experiments, but also to any number of large and complex processes employed in military, bureaucratic and manufacturing applications. 'Information management' would no longer be a technique of making and maintaining records, but rather a dynamic process of experimentation that employed digitized records ('data') as its raw material.
Cold War and White Heat
The 1950s and 1960s were characterized by the onset of the 'Cold War', a period in which the wartime allies of the capitalist West and communist East were pitted against each other in an intense scientific and technological contest to master the new technologies of the age. These (dangerous) rivalries were also expressed in their respective desire to demonstrate the supremacy of their opposing economic systems. As such, the potential of computing to improve the efficiency of industrial production was quickly recognized both by state-owned enterprises in the communist bloc and the private industrial corporations of the Western world, in which the United States had now become predominant. The pursuit of 'information technology' was intended to transform the productive process of global industry, with this modernization furnishing a capacity to rapidly develop and commercialize any number of new technologies. In 1963, the British Prime Minister, Harold Wilson, referred to the 'white heat' of a technological age. The focus of commercial competition was therefore shifting from territorial expansion to the pursuit of more efficient industries and markets via rapid automation. Three years before, US President Dwight D. Eisenhower had already spoken of the new institutional form of scientific research and its co-evolution with what he called the 'military-industrial complex' (1961).
It was the machinery of 'high technology' that caught the public imagination in the 1960s, via the 'space race', nuclear power and the domestication of electronics (notably television). The new centrality of information management, however, subsequently proved to be an equally profound development in the remaking of the modern world. By the 1970s we had entered an era in which vast stores of information appeared to hold greater significance than large volumes of physical resources. All forms of human processes, concepts and activities were being recorded as data that could, in turn, be applied and improved by the machinery of information technology. The perceived outcome of 'computerization' was that electronic calculation could increase both the scale and speed of almost any process repeatedly and infinitely. Thus, it was not simply the capacity of the computer to hold inconceivable amounts of information, but its programmatic capacity to select the right bits of data and process them in new combinations that was permanently revolutionary. In the process, computerization promised to make the conduct of almost any complex undertaking vastly more efficient. The Cold War militaries, for their part, were engaged in a nuclear standoff so dangerous that it demanded automation and the elimination of human error. Military applications thus took primacy in the paranoia of the 1950s, but by the end of the 1960s computer processing was also applied with enthusiasm to all the institutions of modern life. Universities, the traditional storehouses of information, were at the forefront of this process and were influential advocates of computerization.
The benefits of greater speed and efficiency in information processing were also immediately obvious to the various branches of modern government and to the commercial corporations of the day. Sociology, as a putative science of social organization and systematic observer of human behaviour, was a natural partner in the process of 'informationalization'. As a technology of record, computerization promised that the more information could be collected about every aspect of society, the more efficiently society could be assessed and managed. Equally, there were clear commercial benefits in knowing more about the habits of consumption in mass society. Ever more complex industrial processes became conceivable, making automated mass production bigger, faster and more innovative. It is fair to say, then, that computerization in the era of the mainframe was overwhelmingly corporate in scale and instrumental in purpose. The possession of data and the means to process it was intended to confer advantages that were inherently competitive in intent and managerial in flavour. Efficiency became the watchword of the day. In order to achieve these aims, it was necessary in the first instance to put computer technology to work on itself, rapidly advancing the technology and automating the process of its own development. Thus, information about computing itself became a major objective of scientific research, and algorithms for the computation of information (the 'software') became a major constituent in the development of the electronic components (the 'hardware') intended to carry out those functions.
It was, without a doubt, the United States that led the charge in computerization. Its Soviet adversaries made huge efforts in developing their own mainframe systems, while Britain, with its diminishing resources, struggled to keep up. Other European countries pooled resources to stay in the 'high-technology' game. In the developing world, India was keen to commit to the development of computer science, while Japan (under the post-war tutelage of the United States) pursued this new technological and managerial paradigm with unparalleled enthusiasm. As a consequence, it is relatively unsurprising that when the information age was given structural expression in the model of an 'information society', this took the form of a political economy of which the major advocates were American (Drucker 1959; Bell 1973) and Japanese (Masuda 1990). This emerging perspective on social structure has seen many different iterations in theory, but the hallmark of information society theory is social organization via data processing. In Robert Hassan's definition: 'At the broadest level of conceptualization, we can begin by saying that the information society is the successor to the industrial society. Information, in the form of ideas, concepts, innovation and run-of-the-mill data on every imaginable subject – and replicated as digital bits and bytes through computerization – has replaced labour and the relatively static logic of fixed plant as the central organizing logic of society' (2008: 23).
Box 1.1 Characteristics of an information society
• Knowledge displaces skills – fundamental importance of guiding processes over physical actions
• Mechanical archives – complete automation of informational processes
• Social life as data – unprecedented collection and collation of information on human activity
• Purposeful knowledge – value is extracted from the application of information rather than its meaning or essence
• Continuous innovation – configuration of data in new forms becomes the basis of knowledge production
• Competitive velocity – the accelerated speed and efficiency of information techniques constitute an advantage in all fields of activity
• Exponential change – the primary goal of the 'information revolution' is the total transformation of human affairs
The Flowering of Electronic Revolutions
The co-evolution of computerization and electronics was marked by a series of important hardware developments. The biggest breakthroughs were the transistor and the subsequent development of the integrated circuit (silicon chip), which allowed for the photolithographic production of millions of tiny transistors in cheap, powerful and extremely small computer processors. Many of these innovations had far wider applications than the rapid upscaling of computing machines. The post-war decades were also the era in which the Western democracies pursued the dream of a society in which productive efficiency put mass consumption at the heart of everyday life. As such, the fruits of electronic technology beca...