Human Forces in Engineering
About This Book

This book aims to provide engineers with an overview knowledge of disciplines such as sociopolitics, psychology, economics, and leadership.

Engineers are disproportionately represented in senior management and leadership roles, and many work outside typical engineering roles. Technical skills are vital to their success, but so, crucially, is an understanding of the societal setting within which engineering takes place. Engineers who combine their technical and analytical abilities with an understanding of the social context are enormously successful, both professionally and in terms of broader impact.

This book originated from a recognition that this capacity of engineers can be enhanced by an understanding of the 'human forces': the phenomena that underpin and govern human interactions. The key ideas were assembled with domain experts from each field, to provide the critical insights of each discipline and show how these might be practically applied by engineers. The authors provide the basis for the learning necessary to guide high-level strategic decisions, manage teams with diverse skillsets in complex environments, communicate in the context of management and decision-making, and excel at the interface between a technical discipline and non-scientific fields.

Prof. Andrej Atrens is Professor of Materials Engineering at The University of Queensland (UQ). He has experience in Universities and Research Institutes in Switzerland, Thailand, Canada, France, Germany, Sweden, China, USA, Fiji and Australia.

Dr. Aleks Atrens is an Honorary Research Fellow at The University of Queensland (UQ). He earned his BE (Hons) in Chemical Engineering in 2007, and his PhD in 2011, both at UQ, where he has subsequently been a lecturer and researcher.


Aleks D. Atrens, Alexander K. Saeri

Psychology

Abstract: This chapter provides an introduction to key elements of psychology relevant to engineers. Psychology is the science of human mental states and behaviour. It aims to explain and predict beliefs, decisions, and actions of people as individuals and in groups. A working knowledge of psychology will assist with: identifying biased or erroneous reasoning and improving decision-making; promoting effective teamwork and coordination; understanding effective leadership; and recognising the importance of ‘human factors’ in designing resilient engineering and technology systems.
Engineers appraise challenges and design solutions. Understanding and predicting human mental states and behaviour is essential for accurate appraisal of a challenge, and effective design of a solution. Humans form part of every system, either embedded (e.g., a required behaviour within a process), individually (e.g., as a designer of the system), or socially (e.g., as a member of a team that must coordinate action). Humans cannot be ‘designed out’ from a system. Even if humans are not embedded within the system itself, human judgement, decision-making, and social processes are still inextricably linked with who engineers are, and what they do.
Key Concepts: Modes of cognition; cognitive heuristics and biases; expertise; group dynamics; norms; the psychology of leadership; the psychology of safety; human factors; resilience engineering.
Key Ideas:
  1. A useful model for human cognition is that judgements involve two modes of thinking: a fast, effortless, reflexive or intuitive mode of thinking (System 1), and a slower, effortful, and deliberative mode of thinking (System 2).
  2. System 1 thinking relies on learned associations (heuristics) to make fast decisions from limited information. It is very powerful, but is also prone to a range of errors (biases). As a consequence, human judgement in general is prone to characteristic errors.
  3. Expertise is the capacity to make accurate and useful judgements in a specific domain. An expert can integrate their experience and situational cues to quickly envisage a likely answer, or a path forward to make progress on a solution.
  4. Expertise can be developed only in an environment with regular and valid cues, and is built through extensive experience using these cues with rapid and accurate feedback. Expertise may be difficult or even impossible to develop in chaotic environments with low-quality feedback (a high noise-to-signal ratio), or with long delays between action and feedback.
  5. Expertise is domain-specific. Application of expertise across domains can be misleading, as confidence in one’s judgement is a poor indicator of accuracy and it can be difficult to distinguish between useful and useless intuitive judgements.
  6. Social identity theory states that people define themselves not only in terms of “I” (as individuals), but also in terms of “we” (as members of social groups). Every individual has many group memberships and thus many social identities. When team members endorse and work for the benefit of a shared social identity, they will better coordinate their actions and are more likely to achieve shared success.
  7. A strong shared social identity can be a double-edged sword. We care about a social identity like we care about our personal identities. We will seek to uphold, protect, or enhance our group’s beliefs and values. Upholding a group’s values and distinctiveness may be positive in that it promotes coordination and good behaviours within a group. But it can also be negative in that protecting your group’s values and distinctiveness may mean conflicts with other groups.
  8. In the psychological view, leadership is determined by the context of the group, including the group’s shared social identity and purpose, its internal structure, and the environment within which the group operates (e.g., other competing or cooperating groups, availability of resources). This view differs from the traditional concept of leadership as arising from traits or behaviours such as power or charisma.
  9. A leader in the psychological sense represents the group’s interests to other groups, makes decisions on behalf of the group regarding its purpose/goals and the actions to attain them, and persuades group members to contribute to achieving the group’s goals.
  10. Leadership is conferred by followers and is maintained by successfully leading the group. Leadership is likely to be lost by acting against the interests of the group, deviating from group norms, or not advocating for the group’s goals to those outside the group.
  11. Humans are embedded in every engineering system. A system may appear technically reliable but be vulnerable to unexpected external hazards beyond the scope of its design. It may also be vulnerable to unexpected internal hazards if its design does not account for the bounded rationality and group dynamics of its operators.
  12. Existing safety approaches tend to design ever more complex engineering controls to remove or restrict human ‘error’ in a system.
  13. A human factors approach seeks to incorporate a psychological understanding of how humans interact with designed systems to improve performance and safety.
  14. Resilience engineering recognises that the human operator exists to provide resilience and adaptability and seeks to empower them to act appropriately in a system.

1 Introduction

From the outside, psychology may appear to be a clinical science or profession, focussing on the individual and their functioning. In this view, clinical psychologists seek to assist individuals with mental illnesses such as depression, and sports psychologists seek to improve athletic performance. But the science of psychology is concerned with the understanding of the mind in general, and the understanding of human behaviour – of individuals and of groups.
Every system has a human element, whether it’s an engineering system, a political system or a social system. Most systems operate within a social context shaped by regulatory environments, policy ideas, and organisational or team culture. Even carefully-designed mechanistic systems have an inescapable human element, in that all the details of the system are intimately tied to the decisions of their creators. Understanding thoughts and behaviour can help to shed light on how systems work, so as to improve systems, or to identify when they might be going wrong.
This chapter introduces three areas of psychology of relevance to engineers. First, judgement and decision-making, and the development of expertise. Second, group dynamics, focussing on how individuals can lead groups and the way that groups make decisions. Finally, it addresses the relationship between human-system interactions and systems failure, and how systems can be improved through ‘human factors’ and resilience engineering.

2 Judgement and decision-making

Our laws, politics, and many social structures are largely based on the assumption that people can and generally do make rational decisions. We have an impression that decisions are based on an even-handed interpretation of available information. This implies that when someone makes a mistake or a poor choice, they either were given the wrong information, or that they have some inherent incapacity: perhaps a lack of correct training, or maybe a lack of competence to make correct decisions – ‘poor judgement’. This is often evident in criticism when something goes wrong – people talk about ‘human error’, attributing mistakes to poor training, poor judgement, or a lack of correct information.
Such individual-focused explanations are incomplete and can be misleading. All of us, whether more or less skilled, have cognitive features that shape the way we process information. And the way that we process information and reach decisions is not perfectly rational. That by itself may not seem like much of an insight. But there are regularities in our irrationality. Even clever people can make errors in their thinking, and these errors tend to happen in predictable ways. We can then try to identify situations or types of judgements that could be influenced by our irrationality, and to design systems that work around them.
As people gain more skills or knowledge, they typically expect their decision-making to improve – i.e., that their error rate will be reduced as they gain expertise. While expertise does provide benefits, everyone has features of how they process information that can lead them to wrong conclusions. These features are what psychologists refer to as heuristics and biases.

2.1 Modes of thinking

One of the approaches that psychologists use to understand human cognition is to conceptualise two different modes or styles of thinking (dual process theory) [1–3]. The first is an automatic, intuitive mode which involves rapid, seemingly effortless thought [3–5]. This mode of thinking, often referred to as System 1, is very powerful and allows quick and generally-accurate decision-making. However, as an associative and pattern-matching system, it is prone to specific errors in situations for which it is not well-suited. The other mode of thinking is slower, conscious, and analytic [3–5]. This more deliberative type of cognition is often referred to as System 2. It is a mode of thinking that requires more cognitive effort and more time. An example (from Frederick [6]) can illustrate the differences in processing between these two systems. Read the following question, and give your answer out loud:
“A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?”
What was your answer? A System 1 approach may be as follows: the two items together cost $1.10; the bat is associated with one dollar; after a dollar is removed, there’s 10 cents left; the ball must cost 10 cents. Is that right? When asked this question, a substantial proportion of respondents will give this answer.
But if we break the question down using a System 2 approach, it’s not that the bat costs one dollar by itself, it’s that it costs one more dollar than the ball. So the ball must cost 5 cents, and the bat must cost one dollar and 5 cents. If we check the snap judgement that the ball costs 10 cents, we add the cost of the ball (10 cents) to the cost of the bat ($1.10, i.e. $1 more than the cost of the ball) to get $1.20 and realise that the snap judgement was incorrect.
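The System 2 check described above can be written out explicitly. The following short sketch (not from the book) solves the bat-and-ball problem with exact fractions to avoid floating-point rounding, and verifies that the intuitive answer of 10 cents fails the constraint:

```python
from fractions import Fraction

# Let x be the ball's cost. The bat costs x + 1.00, and together
# they cost 1.10, so: x + (x + 1.00) = 1.10  =>  2x = 0.10  =>  x = 0.05
total = Fraction("1.10")
difference = Fraction("1.00")

ball = (total - difference) / 2
bat = ball + difference

print(f"ball = ${float(ball):.2f}, bat = ${float(bat):.2f}")
# ball = $0.05, bat = $1.05

# The correct answer satisfies both constraints...
assert ball + bat == total
assert bat - ball == difference

# ...whereas the intuitive (System 1) answer of $0.10 does not:
intuitive_ball = Fraction("0.10")
assert intuitive_ball + (intuitive_ball + difference) != total
```

Substituting the snap answer back into the problem statement, as the last assertion does, is exactly the deliberate checking step that System 1 skips.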
It’s easy when first seeing this type of question to immediately jump to the snap answer of 10 cents. Did you go through this same thought process? If so, you’ve just seen an example of how you can use System 1 to quickly answer a question, but potentially (as in this case) incorporate an error in the process. If you did get the correct answer, that’s great. But don’t assume that you are immune to these types of cognitive shortcuts. The same type of cognitive processing that leads to the incorrect answer is what allows us to recognise an emotion in a photograph, drive a car, catch a ball, or read and write.
System 1 thinking uses heuristics to incorporate previous knowledge through association, and typically is engaged in pattern matching or pattern recognition. It’s sometimes referred to as instinct or intuition, and has been denigrated as ‘a machine for jumping to conclusions’. It behaves as a fill-in-the-blank judgement process, whereby given a query or stimulus it produces an almost reflexive response. It’s been characterised as an effort-saving approach to cognition. So what’s happening cognitively in the bat and ball example above? The query is phrased in a way that prompts people to reinterpret it, recognise that it sounds similar to other familiar mathematical problems, and reframe or substitute the true query with a similar, easier problem. This results in a subconscious conclusion that the question is a...

Table of contents

  1. Cover
  2. Title Page
  3. Copyright
  4. Preface
  5. Contents
  6. Acknowledgements
  7. List of Contributors
  8. Engineering in the modern world
  9. Psychology
  10. Socio-political analysis
  11. Engineering economics
  12. The economics of climate change
  13. The leadership challenge for engineers
  14. Concluding remarks