Embedded Software Development for Safety-Critical Systems, Second Edition

Chris Hobbs

About This Book

This is a book about the development of dependable, embedded software. It is for systems designers, implementers, and verifiers who are experienced in general embedded software development, but who are now facing the prospect of delivering a software-based system for a safety-critical application. It is aimed at those creating a product that must satisfy one or more of the international standards relating to safety-critical applications, including IEC 61508, ISO 26262, EN 50128, EN 50657, IEC 62304, or related standards.

Of the first edition, Stephen Thomas, PE, Founder and Editor of FunctionalSafetyEngineer.com said, "I highly recommend Mr. Hobbs' book."


Information

Publisher: CRC Press
Year: 2019
ISBN: 9781000507331
Edition: 2
Part I: Background
Chapter 1
Introduction
We’re entering a new world in which data may be more important than software.
Tim O’Reilly
This is a book about the development of dependable, embedded software.
It is traditional to begin books and articles about embedded software with the statistic of how many more lines of embedded code there are in a modern motor car than in a modern airliner. It is traditional to start books and articles about dependable code with a homily about the penalties of finding bugs late in the development process — the well-known exponential cost curve.
What inhibits me from this approach is that I have read Laurent Bossavit’s wonderful book, The Leprechauns of Software Engineering (reference [1]), which ruthlessly investigates such “well-known” software engineering preconceptions and exposes their lack of foundation.
In particular, Bossavit points out the circular logic associated with the exponential cost of finding and fixing bugs later in the development process: “Software engineering is a social process, not a naturally occurring one — it therefore has the property that what we believe about software engineering has causal impacts on what is real about software engineering.” It is precisely because we expect it to be more expensive to fix bugs later in the development process that we have created procedures that make it more expensive.
Bossavit’s observations will be invoked several times in this book because I hope to shake your faith in other “leprechauns” associated with embedded software. In particular, the “100 million lines of code in a modern car” seems to have become a mantra from which we need to break free.
Safety Culture
A safety culture is a culture that allows the boss to hear bad news.
Sidney Dekker
Most of this book addresses the technical aspects of building a product that can be certified to a standard, such as IEC 61508 or ISO 26262. There is one additional, critically important aspect of building a product that could affect public safety — the responsibilities carried by the individual designers, implementers and verification engineers. It is easy to read the safety standards mechanically, and treat their requirements as hoops through which the project has to jump, but those standards were written to be read by people working within an established safety culture.
Anecdote 1: I first started to think about the safety-critical aspects of a design in the late 1980s, when I was managing the development of a piece of telecommunications equipment.
A programmer, reading the code at his desk, realized that a safety check in our product could be bypassed. When a technician was working on the equipment, the system carried out a high-voltage test on the external line as a safety measure. If a high voltage was present, the software refused to close the relays that connected the technician’s equipment to the line.
The fault found by the programmer allowed the high-voltage check to be omitted under very unusual conditions.
I was under significant pressure from my management to ship the product. It was pointed out that high voltages were rarely present and, even if they were, it was only under very unusual circumstances that the check would be skipped.
At that time, none of the techniques described in this book for assessing the situation and making a reasoned, justifiable decision were available to me. It was this incident that set me off down the road that has led to this book.
Annex B of ISO 26262-2 provides a list of examples indicative of good or poor safety cultures, including “groupthink” (bad), intellectual diversity within the team (good), and a reward system that penalizes those who take short-cuts that jeopardize safety (good).
Everyone concerned with the development of a safety-critical device needs to be aware that human life may hang on the quality of the design and implementation.
The official inquiry into the Deepwater Horizon tragedy (reference [2]) specifically addresses the safety culture within the oil and gas industry: “The immediate causes of the Macondo well blowout can be traced to a series of identifiable mistakes made by BP, Halliburton, and Transocean that reveal such systematic failures in risk management that they place in doubt the safety culture of the entire industry.”
The term “safety culture” appears 116 times in the official Nimrod Review (reference [3]) following the investigation into the crash of the Nimrod aircraft XV230 in 2006. In particular, the review includes a whole chapter describing what is required of a safety culture and explicitly states that “The shortcomings in the current airworthiness system in the MOD are manifold and include a Safety Culture that has allowed ‘business’ to eclipse Airworthiness.”
In a healthy safety culture, any developer working on a safety-critical product has the right to know how to assess a risk, and has the duty to bring safety considerations forward.
As Les Chambers said in his blog in February 2012† when commenting on the Deepwater Horizon tragedy:
We have an ethical duty to come out of our mathematical sandboxes and take more social responsibility for the systems we build, even if this means career threatening conflict with a powerful boss. Knowledge is the traditional currency of engineering, but we must also deal in belief.
One other question that Chambers addresses in that blog posting is whether it is acceptable to pass a decision “upward.” In the incident described in Anecdote 1, I refused to sign the release documentation and passed the decision to my boss. Would that have absolved me morally or legally from any guilt in the matter, had the equipment been shipped and had an injury resulted? In fact, my boss also refused to sign and shipment was delayed at great expense.
Anecdote 2: At a conference on safety-critical systems that I attended a few years back, a group of us were chatting during a coffee break. One of the delegates said that he had a friend who was a lawyer. This lawyer quite often defended engineers who had been accused of developing a defective product that had caused serious injury or death. Apparently, the lawyer was usually confident that he could get the engineer proven innocent if the case came to court. But in many cases the case never came to court, because the engineer had committed suicide. This anecdote killed the conversation, as we reflected on its implications for each of us personally.
Our Path
I have structured this book as follows:
Background material.
Chapter 2 introduces some of the terminology to be found later in the book. This is important because words such as fault, error, and failure, often used interchangeably in everyday life, have ...