Digital Media and Society

An Introduction


About This Book

The rise of digital media has been widely regarded as transforming the nature of our social experience in the twenty-first century. The speed with which new forms of connectivity and communication are being incorporated into our everyday lives often gives us little time to stop and consider the social implications of those practices. Nonetheless, it is critically important that we do so, and this sociological introduction to the field of digital technologies is intended to enable a deeper understanding of their prominent role in everyday life.

The fundamental theoretical and ethical debates on the sociology of the digital media are presented in accessible summaries, ranging from economy and technology to criminology and sexuality. Key theoretical paradigms are explored through a broad range of contemporary social phenomena – from social networking and virtual lives to the rise of cybercrime and identity theft, from the utopian ideals of virtual democracy to the Orwellian nightmare of the surveillance society, from the free software movement to the implications of online shopping.

As an entry-level pathway for students in sociology, media, communications and cultural studies, the aim of this work is to situate the rise of digital media within the context of a complex and rapidly changing world.

Information

Author: Adrian Athique
Publisher: Polity
Year: 2013
ISBN: 9780745680668
Edition: 1
Part I
Digital Histories
CHAPTER 1
Building a Digital Society
It is important to recognize that many of the structural possibilities of a digital society were foreseen in the early days of electronic computing (and even before). Others were either unanticipated or came more from the pages of fantasy novels than from the rational projection of technologists. Parallel processes were in play. This first chapter casts its eye firmly on the futurism of the past, specifically from the advent of modern computer science in 1936 to the crisis of undue complexity in 2007. In this brief history, I will recall how digital technologies, through a virtuous circle of mediation, came to reinforce and implement particular ways of conceptualizing and ordering society in the late twentieth century. As the application of digital technologies gathered pace from the 1970s onwards, the various imperatives driving technological change were also reshaping the dominant political structures of developed and developing societies. Because of this apparent confluence of technical and social change, it has become commonplace to argue that the impact of digital technologies is comparable to that of the Industrial Revolution (Castells 1996). An ‘information revolution’ is seen to mark an irreversible transition from physical to intangible (untouchable) commodities and actions, and from embodied to mediated social processes (Leadbeater 2000). More cautious scholars have argued that there is a crucial element of continuity with the previous ‘industrial’ age, in terms of a progressive technical automation of human functions and social organization (Webster 2006). In that respect, the digital media have a substantial and fascinating history, and it is worth situating these arguments against the past evolution of digital society.
Information Society and the Atomic Age
The digital technology of today finds its origins in the mechanical calculating machines invented during the nineteenth century. The ‘analytical engine’ conceived by Charles Babbage in the 1830s was the first machine intended to process and store information for multiple purposes and in flexible configurations of usage (Swade 2001). A prototype of this machine was completed by the beginning of the twentieth century. The storage of information records on ‘punched’ cards was also a nineteenth-century innovation that would later become a key component of computing in the twentieth century. In 1936, Alan Turing introduced the concept of the ‘Turing machine’, a device that would make calculations based upon a large store of printed information which could be selectively applied for mathematical processing (Petzold 2008). Turing took this concept further to demonstrate the idea of a ‘universal machine’ that could read the description of any computational process (an ‘algorithm’) and then simulate its operation. Turing was one of the most significant figures in modern mathematics and, as a consequence, following the outbreak of the Second World War, he was recruited to work at Britain’s secret code-breaking centre at Bletchley Park. There, Turing famously devised the ‘bombe’ machine in order to decipher the secret codes produced by the German cryptological machine (the ‘enigma’) (Copeland 2004).
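Turing’s insight is easier to grasp with a concrete illustration. A minimal sketch in Python can make the idea tangible: a ‘machine’ is just a table of rules, and a single general-purpose simulator can run any such table. (The code below is illustrative only and is not drawn from the sources discussed here; the function and rule names are invented for this example.)

# Illustrative sketch only: a toy Turing machine simulator.
# 'rules' maps (state, symbol) -> (new_state, new_symbol, move),
# where move is +1 (right) or -1 (left); the state 'halt' stops the run.
def run_turing_machine(rules, tape, state="start", pos=0, max_steps=1000):
    cells = dict(enumerate(tape))          # sparse tape; blank cells read as "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        state, cells[pos], move = rules[(state, symbol)]
        pos += move
    return "".join(cells[i] for i in sorted(cells))

# One possible 'algorithm', expressed purely as data: invert a binary string.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine(invert, "10110"))   # prints 01001_

The point of the sketch is Turing’s universality: the simulator itself never changes, while the rule table – the ‘description of a computational process’ – is simply data that can be swapped for any other.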
The global conflagration that killed between 50 and 70 million people in the mid-twentieth century occurred on the cusp of several major scientific breakthroughs, including not only computational machines, but also modern electronics and nuclear physics. In that respect, the war years (1939–45) were as much a scientific and technological contest as they were a military one. The most technologically advanced nations in the world – Britain, the United States and Germany – effectively conscripted their scientific talents and applied them relentlessly to military applications, culminating in the advent of computers, missiles and the atomic bomb in the 1940s. It was in this context that, in 1941, Konrad Zuse developed the first programmable machine operated through information stored in binary code. The United States built the first electronic computer in 1941 and Britain developed an electronic device with limited programmability (the ‘colossus’) in 1943 (Copeland et al. 2006). In 1942, Britain took the momentous decision to share all of its scientific secrets with the United States, and the collaboration between the two countries enabled them to surpass Germany in the fields of logical computing and atomic weaponry. Needless to say, the atomic bomb, and its use against Japan in 1945, was an epochal moment in human history. The significance of the emergence of modern computer science, however, was kept under tight secrecy and did not become fully apparent until a number of years after the war.
The United States Army built the ENIAC device in 1946 to aid in the successful delivery of missile weapons, whilst Britain built the first programmable electronic computers (the ‘Manchester computers’) between 1948 and 1950. Accordingly, the pursuit of electronic computing – in primitive but strategically important forms – by the major antagonists of the Second World War is commonly seen as heralding what has been called the ‘information age’. The conflict had brought together large numbers of scientists, academics and technicians on an unprecedented scale and had demonstrated how major technical achievements could be made quickly through such systematic collaboration. It was this experience that underpinned the decision to massively expand university and technical education in the post-war decades. In making his assessment of these developments for the future, Vannevar Bush, the director of the US Office of Scientific Research and Development, wrote an essay in 1945 in which he reflected on the growing specialization of knowledge and the new tools for managing information that would become essential in the post-war world. Famously, Bush projected the imminent arrival of a desktop information management machine that he called the ‘Memex’. The Memex would facilitate the storage, retrieval and, most critically, the linkage of information, customizable to the needs of each user.
A Memex is a device in which an individual stores all his books, records and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory. It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk. In one end is the stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part of the interior of the Memex is devoted to storage, the rest to mechanism. Yet if the user inserted 5,000 pages of material a day it would take him hundreds of years to fill the repository, so he can be profligate and enter material freely.
Vannevar Bush (1945) ‘As We May Think’, Atlantic Monthly, 176(1): 101–8
The development of ‘mainframe’ computers in the 1950s and 1960s produced rapid leaps in the application of electronic computing to solving advanced mathematical problems. These machines were far from the desk-based device envisioned by Vannevar Bush, commonly occupying an entire room or more. Mainframes required a massive amount of power and a large team to maintain and operate. Nonetheless, the energies spent upon the development of these machines stemmed from a widespread recognition that the concentration of information in forms that could be processed in any number of ways would open up enormous potentials for scientific development. Computerization would simultaneously solve the problem of memorizing and managing all that information. The speed of electronic processing promised to overcome the time- and scale-based limitations of human thinking. This step-change in efficiency could obviously be applied to scientific experiments, but also to any number of large and complex processes employed in military, bureaucratic and manufacturing applications. ‘Information management’ would no longer be a technique of making and maintaining records, but rather a dynamic process of experimentation that employed digitized records (‘data’) as its raw material.
Cold War and White Heat
The 1950s and 1960s were characterized by the onset of the ‘Cold War’, a period in which the wartime allies of the capitalist West and communist East were pitted against each other in an intense scientific and technological contest to master the new technologies of the age. These (dangerous) rivalries were also expressed in their respective desire to demonstrate the supremacy of their opposing economic systems. As such, the potential of computing to improve the efficiency of industrial production was quickly recognized both by state-owned enterprises in the communist bloc and by the private industrial corporations of the Western world, among which the United States had now become predominant. The pursuit of ‘information technology’ was intended to transform the productive processes of global industry, with this modernization furnishing a capacity to rapidly develop and commercialize any number of new technologies. In 1963, the future British Prime Minister Harold Wilson referred to the ‘white heat’ of a technological age. The focus of commercial competition was therefore shifting from territorial expansion to the pursuit of more efficient industries and markets via rapid automation. Three years before, US President Dwight D. Eisenhower had already spoken of the new institutional form of scientific research and its co-evolution with what he called the ‘military-industrial complex’ (1961).
It was the machinery of ‘high technology’ that caught the public imagination in the 1960s, via the ‘space race’, nuclear power and the domestication of electronics (notably television). The new centrality of information management, however, subsequently proved to be an equally profound development in the remaking of the modern world. By the 1970s we had entered an era in which vast stores of information appeared to hold greater significance than large volumes of physical resources. All forms of human processes, concepts and activities were being recorded as data that could, in turn, be applied and improved by the machinery of information technology. The perceived outcome of ‘computerization’ was that electronic calculation could increase both the scale and speed of almost any process, repeatedly and indefinitely. Thus, it was not simply the capacity of the computer to hold inconceivable amounts of information, but its programmatic capacity to select the right bits of data and process them in new combinations, that was permanently revolutionary. In the process, computerization promised to make the conduct of almost any complex undertaking vastly more efficient. The Cold War militaries, for their part, were presiding over nuclear arsenals, a dangerous game that demanded automation and the elimination of human error. Military applications thus took primacy in the paranoia of the 1950s, but by the end of the 1960s computer processing was also being applied with enthusiasm to all the institutions of modern life. Universities, the traditional storehouses of information, were at the forefront of this process and were influential advocates of computerization.
The benefits of greater speed and efficiency in information processing were also immediately obvious to the various branches of modern government and to the commercial corporations of the day. Sociology, as a putative science of social organization and systematic observer of human behaviour, was a natural partner in the process of ‘informationalization’. For a technology of record, the more information that could be collected about every aspect of society, the more efficiently society could be assessed and managed. Equally, there were clear commercial benefits in knowing more about the habits of consumption in mass society. Ever more complex industrial processes became conceivable, making automated mass production bigger, faster and more innovative. It is fair to say, then, that computerization in the era of the mainframe was overwhelmingly corporate in scale and instrumental in purpose. The possession of data, and the means to process it, was intended to confer advantages that were inherently competitive in intent and managerial in flavour. Efficiency became the watchword of the day. In order to achieve these aims, it was necessary in the first instance to put computer technology to work on itself, rapidly advancing the technology and automating the process of its own development. Thus, information about computing itself became a major objective of scientific research, and algorithms for the computation of information (the ‘software’) became a major constituent in the development of the electronic components (the ‘hardware’) intended to carry out those functions.
It was, without a doubt, the United States that led the charge in computerization. Its Soviet adversaries made huge efforts in developing their own mainframe systems, while Britain, with its diminishing resources, struggled to keep up. Other European countries pooled resources to stay in the ‘high-technology’ game. In the developing world, India was keen to commit to the development of computer science, while Japan (under the post-war tutelage of the United States) pursued this new technological and managerial paradigm with unparalleled enthusiasm. As a consequence, it is relatively unsurprising that when the information age was given structural expression in the model of an ‘information society’, this took the form of a political economy whose major advocates were American (Drucker 1959; Bell 1973) and Japanese (Masuda 1990). This emerging perspective on social structure has seen many different iterations in theory, but the hallmark of information society theory is social organization via data processing. In Robert Hassan’s definition: ‘At the broadest level of conceptualization, we can begin by saying that the information society is the successor to the industrial society. Information, in the form of ideas, concepts, innovation and run-of-the-mill data on every imaginable subject – and replicated as digital bits and bytes through computerization – has replaced labour and the relatively static logic of fixed plant as the central organizing logic of society’ (2008: 23).
Box 1.1 Characteristics of an information society
• Knowledge displaces skills – fundamental importance of guiding processes over physical actions
• Mechanical archives – complete automation of informational processes
• Social life as data – unprecedented collection and collation of information on human activity
• Purposeful knowledge – value is extracted from the application of information rather than its meaning or essence
• Continuous innovation – configuration of data in new forms becomes the basis of knowledge production
• Competitive velocity – the accelerated speed and efficiency of information techniques constitute an advantage in all fields of activity
• Exponential change – the primary goal of the ‘information revolution’ is the total transformation of human affairs
The Flowering of Electronic Revolutions
The co-evolution of computerization and electronics was marked by a series of important hardware developments. The biggest breakthroughs were the transistor and the subsequent development of the integrated circuit (silicon chip), which allowed for the photolithographic production of millions of tiny transistors in cheap, powerful and extremely small computer processors. Many of these innovations had far wider applications than the rapid upscaling of computing machines. The post-war decades were also the era in which the Western democracies pursued the dream of a society in which productive efficiency put mass consumption at the heart of everyday life. As such, the fruits of electronic technology beca...

Table of contents

  1. Cover
  2. Halftitle
  3. Dedication
  4. Title Page
  5. Copyright Page
  6. Contents
  7. List of Figures and Boxes
  8. Acknowledgements
  9. Introduction
  10. Part I: Digital Histories
  11. Part II: Digital Individuals
  12. Part III: Digital Economies
  13. Part IV: Digital Authorities
  14. Bibliography
  15. Index