Operational Risk Management

A Practical Approach to Intelligent Data Analysis
About This Book

Models and methods for operational risk assessment and mitigation are gaining importance in financial institutions, healthcare organizations, industry and business in general. This book introduces modern Operational Risk Management and describes how data sources of different types, both numeric and semantic (such as text), can be integrated and analyzed. The book also demonstrates how Operational Risk Management is synergetic with other risk management activities such as Financial Risk Management and Safety Management.

Operational Risk Management: A Practical Approach to Intelligent Data Analysis provides practical and tested methodologies for combining structured numeric data with unstructured, semantic-based data in Operational Risk Management (OpR) data analysis.

Key Features:

  • The book is presented in four parts: 1) Introduction to OpR Management, 2) Data for OpR Management, 3) OpR Analytics and 4) OpR Applications and its Integration with other Disciplines.
  • Explores the integration of unstructured, semantic textual data into Operational Risk Management.
  • Provides novel techniques for combining qualitative and quantitative information to assess risks and design mitigation strategies.
  • Presents a comprehensive treatment of "near-misses" data and incidents in Operational Risk Management.
  • Looks at case studies in the financial and industrial sector.
  • Discusses application of ontology engineering to model knowledge used in Operational Risk Management.

Many real-life examples are presented, mostly based on the MUSING project co-funded by the EU FP6 Information Society Technology Programme. The book provides a unique multidisciplinary perspective on the important and evolving topic of Operational Risk Management. It will be useful to operational risk practitioners and to risk managers in banks, hospitals and industry looking for modern approaches to risk management that combine the analysis of structured and unstructured data. It will also benefit academics interested in research in this field and looking for techniques developed in response to real-world problems.

Frequently asked questions

Simply head over to the account section in settings and click on “Cancel Subscription” - it’s as simple as that. After you cancel, your membership will stay active for the remainder of the time you’ve paid for. Learn more here.
At the moment all of our mobile-responsive ePub books are available to download via the app. Most of our PDFs are also available to download and we're working on making the final remaining ones downloadable now. Learn more here.
Both plans give you full access to the library and all of Perlego’s features. The only differences are the price and subscription period: With the annual plan you’ll save around 30% compared to 12 months on the monthly plan.
We are an online textbook subscription service, where you can get access to an entire online library for less than the price of a single book per month. With over 1 million books across 1000+ topics, we’ve got you covered! Learn more here.
Look out for the read-aloud symbol on your next book to see if you can listen to it. The read-aloud tool reads text aloud for you, highlighting the text as it is being read. You can pause it, speed it up and slow it down. Learn more here.
Operational Risk Management, by Ron S. Kenett and Yossi Raanan, is available in PDF and ePUB format (subject: Business & Insurance).

Information

Publisher
Wiley
Year
2011
ISBN
9781119956723
Edition
1
Subtopic
Insurance
Part I
INTRODUCTION TO OPERATIONAL RISK MANAGEMENT
1
Risk management: a general view
Ron S. Kenett, Richard Pike and Yossi Raanan
1.1 Introduction
Risk has always been with us. It has been considered and managed since the earliest civilizations began. The Old Testament describes how, on the sixth day of creation, the Creator completed his work and performed an ex post risk assessment to determine if further action was needed. At that point in time, no risks were anticipated since the 31st verse of Genesis reads ‘And God saw every thing that he had made, and, behold, it was very good’ (Genesis 1: 31).
Such evaluations are widely conducted these days to determine risk levels inherent in products and processes, in all industries and services. These assessments use terms such as ‘probability or threat of a damage’, ‘exposure to a loss or failure’, ‘the possibility of incurring loss or misfortune’. In essence, risk is linked to uncertain events and their outcomes. Almost a century ago, Frank H. Knight proposed the following definition:
Risk is present where future events occur with measurable probability.
Quoting more from Knight:
Uncertainty must be taken in a sense radically distinct from the familiar notion of risk, from which it has never been properly separated… The essential fact is that ‘risk’ means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomena depending on which of the two is really present and operating…. It will appear that a measurable uncertainty, or ‘risk’ proper, as we shall use the term, is so far different from an unmeasurable one, that it is not in effect an uncertainty at all’.
(Knight, 1921)
According to Knight, the distinction between risk and uncertainty is thus a matter of knowledge. Risk describes situations in which probabilities are available, while uncertainty refers to situations in which the information is too imprecise to be summarized by probabilities. Knight also suggested that uncertainty can be grasped by an ‘infinite intelligence’ and that to analyse these situations theoreticians need a continuous increase in knowledge. From this perspective, uncertainty is viewed as a lack of knowledge about reality.
This separates ‘risk’ from ‘uncertainty’, where the probability of future events is not measured. Of course, what are current uncertainties (e.g. long-range weather forecasts) may some day become risks as science and technology progress.
The notion of risk management is also not new. In 1900, a hurricane and flood killed more than 5000 people in Texas and destroyed the city of Galveston in less than 12 hours, materially changing the nature and scope of weather prediction in North America and the world. On 19 October 1987, a shock wave hit the US stock market, reminding all investors of the inherent risk and volatility in the market. In 1993, the title of ‘Chief Risk Officer’ was first used by James Lam, at GE Capital, to describe a function to manage ‘all aspects of risk’ including risk management, back-office operations, and business and financial planning. In 2001, the terrorism of September 11 and the collapse of Enron reminded the world that nothing is too big to collapse.
To this list, one can add events related to 15 September 2008, when Lehman Brothers announced that it was filing for Chapter 11 bankruptcy protection. Within days, Merrill Lynch announced that it was being sold to rival Bank of America at a severely discounted price to avert its own bankruptcy. Insurance giant AIG, which had previously received an AAA bond rating (one of only six US companies to hold an AAA rating from both Moody’s and S&P) stood on the brink of collapse. Only an $85 billion government bailout saved the company from experiencing the same fate as Lehman Brothers. Mortgage backers Fannie Mae and Freddie Mac had previously been put under federal ‘governorship’, to prevent the failure of two major pillars in the US mortgage system. Following these events, close to 1000 financial institutions have shut down, with losses up to $3600 billion.
The car industry has also experienced such events. After Toyota announced a recall of 2.3 million US vehicles on 21 January 2010, its shares dropped 21%, wiping out $33 billion of the company’s market capitalization. These widely publicized events keep reinvigorating risk management.
The Food and Drug Administration, National Aeronautics and Space Administration, Department of Defense, Environmental Protection Agency, Securities and Exchange Commission and Nuclear Regulatory Commission, among others, have all been implementing risk management for over a decade. Some basic references that form the basis for these initiatives include: Haimes (2009), Tapiero (2004), Chorafas (2004), Ayyub (2003), Davies (1996) and Finkel and Golding (1994).
Risk management, then, has long been a topic worth pursuing, and indeed several industries are based on its successful applications, insurance companies and banks being the most notable. What gives this discipline enhanced attention and renewed prominence is the belief that nowadays we can do a better job of it. This perception is based on phenomenal developments in the area of data processing and data analysis. The challenge is to turn ‘data’ into information, knowledge and deep understanding (Kenett, 2008). This book is about meeting this challenge. Many of the chapters in the book are based on work conducted in the MUSING research project. MUSING stands for MUlti-industry, Semantic-based next generation business INtelliGence (MUSING, 2006). This book is an extended outgrowth of this project whose objectives were to deliver next generation knowledge management solutions and risk management services by integrating Semantic Web and human language technologies and to combine declarative rule-based methods and statistical approaches for enhancing knowledge acquisition and reasoning. By applying innovative technological solutions in research and development activities conducted from 2006 through 2010, MUSING focused on three application areas:
1. Financial risk management. Development and validation of next generation (Basel II and beyond) semantic-based business intelligence (BI) solutions, with particular reference to credit risk management and access to credit for enterprises, especially small and medium-sized enterprises (SMEs).
2. Internationalization. Development and validation of next generation semantic-based internationalization platforms supporting SME internationalization in the context of global competition by identifying, capturing, representing and localizing trusted knowledge.
3. Operational risk management. Semantic-driven knowledge systems for operational risk measurement and mitigation, in particular for IT-intensive organizations. Management of operational risks of large enterprises and SMEs impacting positively on the related user communities in terms of service levels and costs.
Kenett and Shmueli (2009) provide a detailed exposition of how data quality, analysis quality and information quality are all required for achieving knowledge with added value to decision makers. They introduce the term InfoQ to assess the quality of information derived from data and its analysis and propose several practical ways to assess it. The eight InfoQ dimensions are:
1. Data granularity. Two aspects of data granularity are measurement scale and data aggregation. The measurement scale of the data must be adequate for the purpose of the study, and the level of aggregation of the data should match the task at hand. For example, consider data on daily purchases of over-the-counter medications at a large pharmacy. If the goal of the analysis is to forecast future inventory levels of different medications, and restocking is done on a weekly basis, then we would prefer weekly aggregate data to daily aggregate data.
2. Data structure. Data can combine structured quantitative data with unstructured, semantic-based data. For example, in assessing the reputation of an organization one might combine data derived from balance sheets with data mined from text such as newspaper archives or press reports.
3. Data integration. Knowledge is often spread out across multiple data sources. Hence, identifying the different relevant sources, collecting the relevant data and integrating the data directly affects information quality.
4. Temporal relevance. A data set contains information collected during a certain period of time. The degree of relevance of the data to the current goal at hand must be assessed. For instance, in order to learn about current online shopping behaviours, a data set that records online purchase behaviour (such as Comscore data, www.comscore.com) can be irrelevant if it is even one year old, because of the fast-changing online shopping environment.
5. Sampling bias. A clear definition of the population of interest and how a sample relates to that population is necessary in both primary and secondary analyses. Dealing with sampling bias can be proactive or reactive. In studies where there is control over the data acquisition design (e.g. surveys), sampling schemes are selected to reduce bias. Such methods do not apply to retrospective studies. However, retroactive measures such as post-stratification weighting, which are often used in survey analysis, can be useful in secondary studies as well.
6. Chronology of data and goal. Take, for example, a data set containing daily weather information for a particular city for a certain period as well as information on the air quality index (AQI) on those days. For the United States such data is publicly available from the National Oceanic and Atmospheric Administration website (www.noaa.gov). To assess the quality of the information contained in this data set, we must consider the purpose of the analysis. Although AQI is widely used (for instance, for issuing a ‘code red’ day), how it is computed is not easy to figure out. One analysis goal might therefore be to find out how AQI is computed from weather data (by reverse engineering). For such a purpose, this data is likely to contain high-quality information. In contrast, if the goal is to predict future AQI levels, then the data on past temperatures contains low-quality information.
7. Concept operationalization. Observable data is an operationalization of underlying concepts. ‘Anger’ can be measured via a questionnaire or by measuring blood pressure; ‘economic prosperity’ can be measured via income or by unemployment rate; and ‘length’ can be measured in centimetres or in inches. The role of concept operationalization is different for explanatory, predictive and descriptive goals.
8. Communication and data visualization. If crucial information does not reach the right person at the right time, then the quality of information becomes poor. Data visualization is also directly related to the quality of information. Poor visualization can lead to degradation of the information contained in the data.
Effective risk management necessarily requires high InfoQ. For more on information quality see Guess (2000), Redman (2007) and Kenett (2008).
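The data granularity dimension above can be made concrete with a small sketch. The daily sales figures below are invented for illustration; the point is only that a weekly restocking decision calls for weekly aggregates rather than daily ones:

```python
# Hypothetical daily sales counts for one medication over two weeks
# (all figures invented for illustration).
daily_sales = [12, 9, 15, 11, 8, 20, 17,   # week 1
               14, 10, 13, 9, 11, 22, 19]  # week 2

def weekly_totals(daily, days_per_week=7):
    """Aggregate a daily series into weekly sums, matching the
    granularity of a weekly restocking decision."""
    return [sum(daily[i:i + days_per_week])
            for i in range(0, len(daily), days_per_week)]

print(weekly_totals(daily_sales))  # [92, 98]
```

Forecasting inventory from the two weekly totals is then a task at the right level of aggregation; the day-to-day fluctuations are noise for this purpose.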
We are seeking knowledge and require data in order to start the chain of reasoning. The potential of data-driven knowledge generation is endless when we consider both the increase in computational power and the decrease in computing costs. Combined with essentially inexhaustible and fast electronic storage, our ability to solve the intricate problems of risk management has improved by several orders of magnitude.
As a result, the position of chief risk officer (CRO) in organizations is gaining popularity in today’s business world. Particularly after the 2008 collapse of the financial markets, the idea that risk must be better managed than it had been in the past is now widely accepted (see Kenett, 2009). Still, this position is not easy to handle properly. In a sense it is a new version of the corporate quality manager position which was popular in the 1980s and 1990s. One of the problems inherent in risk management is its almost complete lack of glamour. Risk management done well is treated by most people like electric power or running water – they expect those resources to be ever present, available when needed, inexpensive and requiring very little management attention. It is only when they are suddenly unavailable that we notice them. Risks that were well managed did not materialize, and their managers got little attention. In general, risk management positions provide no avenues to corporate glory. Indeed, many managers distinguish themselves in times of crisis who would have gone almost completely unnoticed in its absence. Fire fighting is still a very prevalent management style. Kenett et al. (2008) formulated the Statistical Efficiency Conjecture, which stipulates that organizations exercising fire fighting, as opposed to process improvement and quality by design, are less effective in their improvement initiatives. This was substantiated with 21 case studies, collected and analysed to convince management that prevention carries significant rewards.
An example of this phenomenon is the sudden glory bestowed on Rudy Giuliani, the former Mayor of New York City, because of his exceptional crisis management in the aftermath of the September 11 terrorist attack on the twin towers. It was enough to launch his bid for the presidency (although not enough, apparently, to get him elected to that office or even to the post of Republican candidate). Had the attacks been avoided, by a good defence intelligence organization, he would have remained just the Mayor of New York City. The people who would have been responsible for the prevention would have got no glory at all, and we might even never have heard about them or about that potential terrible threat that had been thwarted. After all, they were just doing their job, so what is there to brag about? Another reason for not knowing about the thwarted threat, valid also for business risk mitigation strategies, is not exposing the methods, systems and techniques that enabled the thwarting.
Nonetheless, risk management is a critically important job for organizations, much like vaccination programmes. It must be funded properly and given enough resources, opportunities and management attention to achieve concrete results, since it can be critical to the organization’s survival. One should not embrace this discipline only after disaster strikes. Organizations should endeavour to prevent the next one by taking calculated, evidence-based, measured steps to avoid the consequences of risk, and that means engaging in active risk management.
1.2 Definitions of risk
As a direct result of risk being a statistical distribution rather than a discrete point, there are two main concepts in risk measurement that must be understood in order to carry out effective risk management:
1. Risk impact. The impact (financial, reputational, regulatory, etc.) that will happen should the risk event occur.
2. Risk likelihood. The probability of the risk event occurring.
This likelihood usually has a time period associated with it. The likelihood of an event occurring during the coming week is quite different from the likelihood of the same event occurring during the coming year. The same holds true, to some extent, for the risk impact, since the same risk event occurring at two different points in time may result in different impacts. These differences may even arise because the organization, realizing that the event might happen, has engaged actively in risk management; by the later of the two time periods it was better prepared for the event and, although it could not stop the event from happening, it succeeded in reducing its impact.
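A minimal sketch of these two concepts is given below. The impact figure and weekly likelihood are invented, and the annualization assumes independent weeks, a simplifying assumption made here for illustration rather than anything prescribed by the chapter. It shows why the time period attached to a likelihood matters:

```python
def expected_loss(impact, likelihood):
    """Expected loss of a risk event: impact weighted by its probability."""
    return impact * likelihood

def annualize(p_weekly, weeks=52):
    """Convert a weekly event probability to an annual one, assuming
    the event is independent across weeks (a simplifying assumption)."""
    return 1 - (1 - p_weekly) ** weeks

# Hypothetical event: $200,000 impact, 0.1% chance in any given week.
p_week = 0.001
p_year = annualize(p_week)                 # ~0.0507, i.e. about 5% per year
print(expected_loss(200_000, p_week))      # weekly expected loss: 200.0
print(expected_loss(200_000, p_year))      # annual expected loss: ~10,140
```

The same 0.1% weekly likelihood translates into roughly a 5% annual likelihood, which is why stating a likelihood without its time period is ambiguous.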
Other base concepts in the risk arena include:
  • Risk event. An actual instance of a risk that happened in the past.
  • Risk cause. The preceding activity that triggers a risk event (e.g. fire was caused by faulty electrical equipment sparking).
Risk measures themselves carry risk: they are subject to change, and so a risk measure will often come with a confidence level that tells the reader how uncertain the measure itself is. That is, there may be some uncertainty about the prediction of risk, but this should never be a reason to avoid the sound practice of risk management, since its application has generated considerable benefits even with less than certain predictions.
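One common way to attach such a confidence statement to an estimated likelihood is a normal-approximation interval around the observed event rate. This is a standard textbook approximation, not a method prescribed by the chapter, and the counts below are invented:

```python
import math

def risk_estimate_with_ci(events, trials, z=1.96):
    """Estimate an event probability from historical counts, with a
    normal-approximation 95% confidence interval that conveys the
    'risk of the risk measure'. Clipped to [0, 1]."""
    p = events / trials
    half_width = z * math.sqrt(p * (1 - p) / trials)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Hypothetical history: 12 loss events observed in 1000 process runs.
p, lo, hi = risk_estimate_with_ci(events=12, trials=1000)
print(f"estimated likelihood {p:.3f}, 95% CI [{lo:.4f}, {hi:.4f}]")
# estimated likelihood 0.012, with a CI of roughly [0.0053, 0.0187]
```

The width of the interval is the point of the paragraph above: the risk measure 0.012 is itself uncertain, and a decision maker should see that uncertainty alongside the estimate.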
1.3 Impact of risk
In her book Oracles, Curses & Risk Among the Ancient Greeks, Esther Eidinow shows how the Greeks managed risk by consulting oracles and placing curses on people who affected their lives (Eidinow, 2007). She also posits that risk management is not just a way of handling objective external dangers but is socially constructed; therefore, information about how a civilization perceives risk provides insights into its social dynamics and view of the world. The types of risk we are concerned with at a given point in time also provide insights into our mindset. Specifically, the current preoccupation with security, ecological and IT risks would make excellent research material for an anthropologist in 200 years.
This natural tendency to focus on specific types of risk at certain times causes risk issues, as it is exactly the risks you have not been focusing on that can jump up and bite you. In his book The Black Swan, Nassim Nicholas Taleb describes events that have a very low probability of occurrence but can have a very great impact (Taleb, 2007). Part of the reasons he gives for these unexpected events is that we have not been focusing on them or their possibilities because of the underlying assumptions we made about our environment (i.e. all swans are white).
It is also true that the impact of many risk events is difficult to estimate precisely, since often one risk event triggers another, sometimes even a chain reaction, and then the measurements tend to become difficult. The distribution of the total impact of a compound event among its components is not of great importance during an initial analysis of risks: we would be interested in the whole, not in the parts, since our purpose is to prevent the impact. A subsequent, finer analysis may indeed assign the impacts to the component parts if their happening separately is deemed possible, or if it is possible (and desirable) to manage them separately. A large literature exists on various aspects of risk assessment and risk management. See, for example, Alexander (1998), Chorafas (2004), Doherty (2000), Dowd (1998), Embrechts et al. (1997), Engelmann and Rauhmeier (2006), Jorion (1997), Kenett and Raphaeli (2008), Kenett and Salini (2008), Kenett and Tapiero (2009), Panjer (2006), Tapiero (2004) and Van den Brink (2002).
1.4 Types of risk
In order to mitigate risks the commercial world is developing holistic risk management programmes and approaches under the banner of enterprise risk management (ERM). This framework aims to ensure that all types of risk are considered and attempts are made to compare different risk types within one overall risk measurement approach. There are many ERM frameworks available, but one of the most prevalent is the COSO ERM model created by the Committee of Sponsoring Organizations of the Treadway Commission. This framework categorizes risks within the following types: (1) financial, (2) operational, (3) legal/comp...

Table of contents

  1. Cover
  2. Statistics in Practice
  3. Title Page
  4. Copyright Page
  5. Dedication
  6. Foreword
  7. Preface
  8. Introduction
  9. Notes on Contributors
  10. List of Acronyms
  11. Part I: INTRODUCTION TO OPERATIONAL RISK MANAGEMENT
  12. Part II: DATA FOR OPERATIONAL RISK MANAGEMENT AND ITS HANDLING
  13. Part III: OPERATIONAL RISK ANALYTICS
  14. Part IV: OPERATIONAL RISK APPLICATIONS AND INTEGRATION WITH OTHER DISCIPLINES
  15. Statistics in Practice
  16. Index