Risky Work Environments

Reappraising Human Work Within Fallible Systems

Pascal Béguin, Christine Owen

eBook - ePub, 226 pages, English

About This Book

Risky Work Environments provides new insights into the multiple and dynamic trajectories of both near misses and mistakes in complex work environments, based on actual case examples. It also studies the interactions between various activity systems or work practices (design, maintenance, incident investigation, regulation, operation) and their consequences for operational performance. The role of rules and regulations is explored, considering the consequences of deviations and the limitations of enforced compliance. Further, the book explains how to search for, think about and act on information about vulnerability, near misses and mistakes in ways that emphasize accountability without being punitive, and that are instead responsible, innovative and provide opportunities for learning.

Writing from different disciplines and theoretical perspectives, the contributors analyse work in risky environments including air traffic control, offshore mining, chemical plants, neonatal intensive care units, ship piloting and emergency call dispatch centres. In each chapter the authors present rich empirical data, and their analyses illustrate a variety of ways in which, despite imperfect systems, safety and resilience are created in human action. In the chapters that focus on error or mistakes, the analysis reveals the logic of the actions undertaken at the time, as well as their constraints.

The contributors are all active researchers within their disciplines and come from Australia, Finland, France, Norway and the Netherlands. The book will be of direct interest to safety scientists and researchers, as well as human factors practitioners working in complex technological systems.

Information

Publisher: CRC Press
Year: 2017
ISBN: 9781317062523
Chapter 1
Introduction: Shifting the Focus to Human Work within Complex Socio-technical Systems
Pascal Béguin, Christine Owen and Ger Wackers
It is likely that most researchers and practitioners operating within (that is, ‘with’ and ‘in’) complex technological systems would agree with the observation that humans contribute positively to mitigating risk. Indeed, there is a growing consensus that researching error alone cannot help us to fully understand how reliability in systems is accomplished. The contributors in this book show how such reliability is created in the ways in which humans work within their systems, positively taking into account the fallibility of those systems. This book is, therefore, about illustrating the ways in which reliability is accomplished most of the time, despite imperfect systems, technologies and organizational procedures.
The core question of this book is:
How do workers operating within risky environments and in imperfect systems positively work to mitigate those risks and what problems do they encounter in so doing?
The authors of this book contribute to a better theoretical understanding of safety in systems, through their focus on the role of human work in the design of complex technological systems. They do this, first, by analysing the positive role of human actions with those systems as well as in those systems. Second, they do so by developing better understandings of the ways in which humans accomplish work in those environments, despite their own or systemic limitations, even in the case of failure. In this way the contributors move beyond typical error analysis toward analyses of how human operators positively act to mitigate risk in everyday operations.
In this book, readers will find a rich body of data from empirical investigations in a range of industrial sectors, and will meet:
• ship pilots navigating a diversity of seagoing ships through the channels and between the islands in a coastal archipelago;
• rail traffic controllers trying but failing to avoid a frontal collision between two trains;
• engineers working within the multi-billion-dollar offshore mining industry, whose work we can better understand within the context of broader socio-economic and political influences;
• medical doctors and nurses in neonatal intensive care units who must find treatment trajectories in the face of uncertainty about what is wrong with the seriously ill and premature newborns in their care, and about what the long-term results may be;
• staff who work in an emergency medical communications centre and who constantly strive to integrate potentially conflicting internal and external goals and associated risks to ensure ‘acceptable’ operational performance;
• designers involved in developing artefacts to avoid runaway chemical spills, who face the challenge of better understanding the ways in which the tools they develop are redesigned in practice by users;
• air traffic controllers who juggle the temporal demands of complex work within the constraints and opportunities of technologically mediated interdependent work.
Two arguments are central to developing a more holistic understanding of human work in complex systems. The first is that systems are fallible, and that we therefore need to move the focus away from human error and toward the dynamics of systems and their vulnerabilities. The second is that human work practices must be placed at the centre of theory building if this understanding is to be achieved. These two arguments are at the heart of the discussions among the authors of this book.
From Error … to Working within Fallible Systems
The history of human factors research and its account of error and accidents has its own developmental trajectory. In the early days of civil aviation, for example, the main focus of post-accident accounts of error was individual human blame. Realizing this accomplished little, the focus shifted from individual blame to post-hoc analyses of accidents that attempted to be more systemic (Maurino 1995).
However, these accounts also came under criticism for their sometimes overly simplistic, linear cause-effect approach. And once more, while they offered a more satisfying explanation of what produced the error, they remained, for the most part, overly focused on what went wrong, typically in hindsight (Woods and Cook 1999).
One of the key contributions of this phase of attempting to understand the causes of accidents more holistically was the attention given to what James Reason (1990) called ‘latent factors’ (for example, training, organizational policies, organizational culture). These insights led to the rise of programs associated with crew resource management, where the focus was on strategies to help crew members draw on the communicative resources available within the team, and not just technical abilities: to delegate tasks, assign responsibilities, and facilitate communication and cross-checking (Helmreich and Foushee 1993).
Subsequent initiatives to improve the safety of high reliability organizations led to the development of various iterations of crew resource management training, specifically designed to decrease the probability of errors occurring before they had a chance to have an impact (Helmreich and Wiener 1993). Such studies shed insight into the different types of errors made in practice, why they were made and the conditions under which particular types of error are made (for example, Wiegmann and Shappell 1997). Yet the focus was still on error, which was fortunately becoming ever more rare in high reliability environments (see Amalberti 2001; Reason 1997, 2004).
These studies migrated into accounts of safety that investigate threats and risks in normal everyday operations. While the focus of these studies (see, for example, Hawkins 1993; Johnston, MacDonald and Fuller 1997) has moved from investigating accidents to addressing how operators mitigate threats and associated risks, attention still remains on mistakes and abnormality.
The pervasiveness of error analyses
Error persists as a focus for at least three reasons. The first is that error analyses and their conclusions serve multiple purposes. There is, for example, a need to identify and attribute responsibility for the incident or the accident after the fact. This has to do with accountability: a human error-cause is linked to the idea that only humans can be held accountable. A second reason has to do with learning. Given the enormous consequences of failure in complex systems, what can be learned so that similar mistakes might be avoided in the future? The third reason is deeply anchored in our culture. We need to have faith in the systems we create and the technologies we use. The notion of an ‘error’ allows us to localize and isolate a specific deviation from ‘designed’ performance criteria, a deviation that can be repaired. Recognizing (in the sense of admitting) and repairing the ‘error’ is important in re-establishing trust in particular technologies that have failed but also, in a more general sense, in maintaining or re-establishing trust in the complex technological systems on which our technological culture has become so dependent. Many of the analyses in this book focus attention on what is problematic in each of these views.
Toward Human-centred Work Practices
For some time there has been a growing consensus that error-focused analyses of human work in complex systems are inadequate and that new research methodologies and approaches are needed (Dekker 2006; Hollnagel 2005; Hollnagel and Woods 1999; Rasmussen 1990, 2000, 2003; Reason 1990; Vicente 1999; Weick 1999).
This has occurred for two reasons. First, as indicated, given that catastrophic accidents are rare, it is questionable what contribution their study can offer to understanding what happens in workplaces most of the time. Second, there was increasing dissatisfaction with approaches that seek to understand what happens in complex systems only through analysis of what happens when things go wrong, and an increasing desire to move beyond ‘deficit’ models (e.g., Flach 1999; Hollnagel 2005). Within the human factors and safety science literature, then, there is increasing dissatisfaction with the study of accidents and error as the only means by which we can understand resilience in complex systems. However, although there are many exhortations in the literature to ‘shift the focus beyond error’ (Hollnagel 1993, 2005; Hollnagel and Woods 1999), such analyses have often remained centred on the loss of comprehension or the failure. This book takes as given the importance of developing holistic ways of understanding everyday human work in complex environments (Rochlin 1999; Rasmussen 1990, 2000; Reason 1990).
Redefining the problem of working within risky environments
Our claim here is that the everyday work environment is inherently imperfect, whether or not the system is in a degraded state. The contributions in this book begin to develop this shift toward a more human-centred focus on work within fallible systems by illustrating the ways in which humans contribute positively to mitigating risk, despite fallible technologies, unrealistic rules and procedures, and organizations and systems that are imperfect by their very nature.
However, it should also be acknowledged that a positive approach to examining work within risky environments is not completely new. In 1967, Perrow proposed a classification of tasks based on their variability, and suggested that high variability combined with low predictability requires a flexible and decentralized work organization in which the worker must build understanding and solve problems as and when they appear. Rasmussen (2000) noted that, as routine tasks are progressively automated, the importance given to situated activity and to user intelligence has constantly grown. Through their activities within risky environments and within imperfect systems, humans mitigate risk. However, a key challenge yet to be fully addressed in the literature is: how are we to better understand and support professional practices within fallible systems?
The book urges readers to recognize that technologies are often fallible. Indeed, the chapters in this book provide strong empirical evidence for the argument that reliable performance occurs in complex environments despite often conflicting constraints imposed by divergent organizational goals with respect to safety, availability, profitability and regularity. Devices, rules and procedures are not sufficient for achieving successful, safe and reliable work. Moreover, safety cannot be achieved by strict proceduralization or any other form of standardization in complex systems.
The contributors bring together empirical research and a diverse range of theoretical perspectives to examine human work within complex technological systems. They draw on foundational approaches in human factors analysis, such as cognitive ergonomics and complexity theory, as well as on Cultural Historical Activity Theory (CHAT) and approaches from the field of science and technology studies (STS), specifically actor-network theory. A number of the authors are French-speaking: the French-speaking ergonomics community has a rich history and has elaborated concepts that are not well known to English-speaking audiences.
Writing from different disciplines and theoretical perspectives, the contributors in the book highlight differing questions, levels of analysis, problems and solutions. But in spite of their diversity, they share the same aim: drawing attention to individual, collective, systemic and technological resources that can be used to accomplish successful work in risky environments. Thus the diversity of approaches taken in the book provides the reader with a set of theoretical resources with which to analyse human work in complex environments.
For example, Marc and Rogalski (Chapter 6) analyse why safety remains high at the level of the collective in an emergency communications centre, despite a high level of errors made by individuals. If theoretical development relies only on understanding the health of systems through the lens of error analysis, then these kinds of accomplishments will remain unidentified.
Several analyses of serious accidents are also outlined in this book. However, rather than identifying a culprit, contributors analyse the socially constructed and contextual nature of error and accident inquiries. The importance of these studies is that they draw into their analysis the paradigms, assumptions and political pressures that are inherent in the work of accident investigation, something not often acknowledged in traditional forms of error analysis.
Despite their differing theoretical orientations, the focus for the contributors here is to define the problems to be solved starting from the human work that is undertaken. In analysing how humans (in and through the systems they create) mitigate risks, this focus begins to articulate strategies for developing, designing and expanding individual, collective, organizational and technological resources for work in risky environments.
There are paradoxes faced by workers operating within fallible systems. Double-binds are present in much work practice: for example, between the need for safety on the one hand and the need for efficiency on the other, or between working within the bounds of procedures and needing to improvise in action. In part, such paradoxes stem from managerial assumptions about ‘bounded rationality’ (Shotter 1983), that is, the assumption that the problem of uncertainty in work can be reduced by encoding information into organizational structure. The more completely the structure of an organization can provide the needed information, the less uncertainty there will be: the human operator just needs to follow the course of known rational action.
As already indicated, a key theme in this book is that rules and procedures are not sufficient for the successful achievement of safe and reliable work. Here, we analyse the ways in which people make decisions at the interface between what is prescribed by procedure and what must sometimes be improvised in action. A key question for the contributions is therefore: what light can they shed on the balance between needed prescription and proceduralization, and the initiative that can be encouraged and enabled? What needs to be decided in advance, and what can be left to the decision-maker to be created in action?
One solution to such paradoxes can be found in the process of problem ‘visibilization’ in organizations (Engeström 1999; Rasmussen 2000). The process of making the problem visible is important during analysis and needs to be put to use in this context (Leplat 2005; Hollnagel 2005). However, organizations are paradoxical places, and making problems visible inside an organization needs to be undertaken carefully.
Under some conditions there are systemic contradictions in organizations where it is simply not acceptable to make visible the violation of a procedure. This is because of the close relationship between knowledge and power. Problem visibilization is knowledge, intertwined with power relations. If people feel that they will be exposed and vulnerable (because of organizational or public sanctions) for ...

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. List of Figures
  7. List of Tables
  8. About the Contributors
  9. 1 Introduction: Shifting the Focus to Human Work within Complex Socio-technical Systems
  10. PART I IDENTIFYING SYSTEM VULNERABILITIES WITHIN INCIDENT AND ACCIDENT ANALYSIS
  11. PART II ACCOMPLISHING RELIABILITY WITHIN FALLIBLE SYSTEMS
  12. PART III ENHANCING WORK PRACTICES WITHIN RISKY ENVIRONMENTS
  13. Index