Technology & Engineering

Statistical Process Control (SPC)

Statistical Process Control (SPC) is a method used to monitor and control processes to ensure they operate efficiently and produce high-quality products. It involves using statistical techniques to analyze and improve processes, identify variations, and make data-driven decisions. By continuously monitoring and adjusting processes, SPC helps organizations maintain consistency and meet quality standards.

Written by Perlego with AI-assistance

12 Key excerpts on "Statistical Process Control (SPC)"

  • Quality Control for Dummies
    • Larry Webber, Michael Wallace(Authors)
    • 2011(Publication Date)
    • For Dummies
      (Publisher)
    Chapter 10: Assessing Quality with Statistical Process Control

    In This Chapter
    • Digesting the basics of Statistical Process Control
    • Becoming familiar with the role of the control chart in SPC
    • Building control charts to evaluate and correct a process
    • Determining whether a process is capable

    The following quote is from the “father” of Statistical Process Control (SPC), Walter Shewhart. Shewhart pioneered the idea that all manufacturing processes produce some variation that is natural to them, and that quality problems arise when abnormal variation occurs. He was the first to use statistics to measure the natural variation of a process and detect the occurrence of unnatural variation.

    Bringing a production process into a state of ‘statistical control,’ where there is only chance-cause variation, and keeping it in control, is necessary to predict future output and to manage a process economically. — Walter Shewhart

    In this chapter, you find out how to use simple statistical techniques to, as Shewhart said, bring your production processes into a state of “statistical control” by eliminating variation. With this control and variance elimination, you can produce a quality product or service for your customers.

    Grasping the Basics of Statistical Process Control

    Statistical Process Control (SPC) is defined as the use of statistical tools and techniques to measure a production process in order to detect change. To produce a quality product, you must have a process that is consistent. SPC helps you detect any changes in the process, because change is more often than not a bad thing. SPC is a philosophy that embraces the idea of continuous improvement brought on by using an assortment of statistical tools. The basis of the philosophy is that most of your process issues are caused by problems within the system, not by individual people.
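Shewhart's distinction between natural and abnormal variation can be sketched in a few lines of Python. This is an illustrative sketch only: the baseline data, the new measurements, and the simple 3-sigma rule below are assumptions for illustration, not taken from the book.

```python
# Illustrative sketch: flag measurements outside Shewhart's 3-sigma limits.
# The baseline data and new measurements are hypothetical.

def control_limits(samples):
    """Estimate the centre line and 3-sigma limits from in-control data."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / (n - 1)
    sigma = variance ** 0.5
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Return indices of points showing abnormal (special-cause) variation."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0]
lcl, centre, ucl = control_limits(baseline)
new_data = [10.0, 9.9, 11.5, 10.1]  # 11.5 lies well outside the limits
print(out_of_control(new_data, lcl, ucl))
```

Points inside the limits show only chance-cause variation and call for no action; points outside signal that something unnatural has entered the process.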
  • Quality Management in Plastics Processing
    Chapter 5

    Statistical process control (SPC)

    Statistical process control is one of the most important components of effective quality management. It acts as a ‘feed forward’ control that allows a processor to actually control quality, rather than relying on the typical inspection-based approach, which is reactive and after the event.
    The sad thing is that SPC has been around for many years and yet has never really been taken seriously by the plastics processing industry. Individual companies have used SPC to great effect and have gained control of their processes, but the industry as a whole is still remarkably reluctant to use the methods, and they are not often seen at processors.
    This is due, in part, to widespread misunderstanding of how SPC works and of the very quick benefits it delivers. Most of the standard texts on SPC are written by statisticians who move very quickly over the basics and get deeply involved in the mathematics and formulas without explaining how SPC works in practice. Many people have a fear of mathematics, and their eyes glaze over at the first sight of an equation. In reality, operating SPC requires nothing more than a basic grasp of arithmetic and the ability to see patterns in the plotted results. Setting up a basic SPC system does need some higher skills, but most of it is very simple and still only needs basic arithmetic.
    Anybody who can operate Excel can set up an SPC system and reap the benefits almost immediately.
    SPC naturally involves numbers and many off-the-shelf and bespoke systems have been developed (even I developed one in the dim and distant past). Many of these are excellent (sadly, not mine) and when the amount of data becomes large they can be useful to improve data handling. Their fundamental problem is that they take away all of the basics and present the users simply with numbers and little background information. This can divorce the user from the actual process and, by reducing interaction with the actual process, can reduce the value of the SPC system.
  • Six Sigma and the Product Development Cycle
    • Graham Wilson(Author)
    • 2012(Publication Date)
    • Routledge
      (Publisher)

    7

    Statistical process control

    What is statistical process control?

    In its modern form, statistical process control (SPC) has been available for nearly 70 years. It is a form of charting, based on sound statistical principles, which allows us to see how product or service processes are performing. Not only can SPC tell us when something is going wrong or beginning to go wrong, but when alarming changes occur, it can also tell us whether there is any point in trying to do something to remedy them.
    Properly applied, SPC is virtually foolproof. It is simple to use, involves little or no complicated mathematics, and almost guarantees to pay for itself in saved effort. However, despite such a long pedigree and all its selling features, SPC probably has one of the poorest adoption records of any management technique.
    There are quite a number of reasons for this, although I suspect that the most significant is the first word in its title: statistical. If you do not beat them to it, almost every SPC course has some wit who will recall Disraeli's comment: ‘There are lies, damned lies, and statistics’. For people who have a perception that they are not that good at mathematics, the word strikes a fear that they will not be able to cope. Most schools opt for the bizarrely misnamed ‘applied mathematics’, rather than statistics, as an A-level option. Statistics is sometimes seen as an irrelevance, while to others it is a ‘soft’ science, almost an art. Even university students have been known to look down on their statistics options as lacking rigour and worth.
    In practice, statistics probably has more industrial applications than any form of applied mathematics (of course, it is applied mathematics, but that phrase is strangely reserved for mechanics). It is perfectly rigorous, but deals with much larger numbers of events at any one time. This gives it a ‘cloud-like’ quality, which some people mistake for lack of clarity. In many cases, because it deals with the real world, the use of statistics is no harder than drawing a simple graph, unlike the complex proofs of mechanics.
  • Smart and Sustainable Operations and Supply Chain Management in Industry 4.0
    • Turan Paksoy, Muhammet Deveci, Turan Paksoy, Muhammet Deveci(Authors)
    • 2023(Publication Date)
    • CRC Press
      (Publisher)
    6 Statistical Process Control (SPC) and Quality Management, by Muzaffer Alım and Saadettin Erhan Kesen
    DOI: 10.1201/9781003180302-6

    CONTENTS

    • 6.1 Introduction
    • 6.1.1 Quality Control and Statistical Process Control
    • 6.1.2 Historical Evolution
    • 6.1.3 Statistical Process Control
    • 6.1.3.1 Control Charts
    • 6.1.3.1.1 Example of a Control Chart Application
    • 6.1.3.2 Process Capability Analysis
    • 6.2 Quality 4.0
    • 6.2.1 Benefits and Challenges of Q4.0
    • 6.2.2 Business Implications in Q4.0
    • 6.3 Quality Control and Sustainability
    • 6.4 Case Study
    • 6.4.1 Introduction of Case
    • 6.4.2 Descriptive Statistics
    • 6.4.3 Control Charts
    • 6.4.4 Process Capability Analysis
    • 6.5 Conclusion
    • 6.6 References

    6.1 Introduction

    At the end of the 18th century, steam-powered machines were invented, enabling heavy operations to be done more easily. In this way, production systems relying on manpower progressed to machine-power-based systems. Later, with the widespread use of electricity, mass production lines emerged, resulting in massive quantities of products. The development of computer systems in the early 1970s opened up a new era in production in which processes and machines could be controlled by automation systems. Rapid advances in the Internet and new-generation technologies, such as Wi-Fi, the Internet of Things, robotics, and blockchain, have brought machines and robots in production processes to a position where they can perform their own analysis and make coherent decisions by themselves. Furthermore, these technologies have created large networks in which more devices can connect to each other and communicate. Throughout this entire Industrial Revolution period, two factors have perpetually increased: the amount of production and the energy requirement. The increase in production paved the way for the emergence of many companies and the formation of a competitive market environment.
  • Statistical Process Control for the Food Industry
    eBook - ePub

    Statistical Process Control for the Food Industry

    A Guide for Practitioners and Managers

    • Sarina A. Lim, Jiju Antony(Authors)
    • 2019(Publication Date)
    • Wiley
      (Publisher)
    4 An Introduction of SPC in the Food Industry: Past, Present and Future

    4.1 Statistical Process Control: A Brief Overview

    Understanding the meaning of Statistical Process Control (SPC) is vital in operating SPC in the food industry. There have been attempts to expand the concept of SPC beyond a process-monitoring technique.
    SPC has been categorised into several types of definitions, such as:
    • technological innovation (Bushe 1988; Roberts, Watson, and Oliver 1989);
    • process management technique (Bissell 1994);
    • control algorithm (Hryniewicz 1997);
    • a component of total quality management (TQM) (Barker 1990);
    • one of the quality management systems in the food industry (Caswell, Bredahl, and Hooker 1998);
    • a participatory management system built on teamwork, employee involvement and real-time decision-making, as viewed by Wallace et al. (2012) and Davis and Ryan (2005) (Deming 1986; Elg, Olsson, and Dahlgaard 2008).
    SPC is a powerful collection of problem-solving tools useful in achieving process stability and improving capability through the reduction of variability. (Montgomery 2012)
    The focus of SPC is for the users to understand the variation in values of quality characteristics (Woodall 2000). The primary indicator of an effective SPC application is a stable process. Process stability refers to the stability of the underlying probability distribution of a process over time, which can very often be described as the stability of the distribution parameters over time (Mahalik and Nambiar 2010). Process stability is crucial because it is a prerequisite for determining process capability indices (Brannstrom-Stenberg and Deleryd 1999; Motorcu and Gullu 2006; Sharma and Kharub 2014).

    Prior to the assessment of process capability, the process must be ensured to be stable. Process capability indices developed from an unstable process are not reliable.
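As a hedged illustration of the capability assessment described above, the two most common indices are Cp = (USL - LSL) / 6σ and Cpk = min((USL - μ) / 3σ, (μ - LSL) / 3σ). The specification limits and process statistics in this sketch are hypothetical:

```python
# Sketch of process capability indices for a process already shown stable.
# Specification limits and process statistics are hypothetical.

def capability(mean, sigma, lsl, usl):
    """Return (Cp, Cpk) for a process assumed stable and roughly normal."""
    cp = (usl - lsl) / (6 * sigma)                       # potential capability
    cpk = min((usl - mean) / (3 * sigma),
              (mean - lsl) / (3 * sigma))                # accounts for centring
    return cp, cpk

cp, cpk = capability(mean=10.1, sigma=0.1, lsl=9.7, usl=10.3)
print(round(cp, 2), round(cpk, 2))
```

Because the process mean sits off-centre, Cpk comes out lower than Cp; computed from an unstable process, neither number would mean anything, which is the point the passage makes.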
  • Mastering Statistical Process Control
    • Tim Stapenhurst(Author)
    • 2013(Publication Date)
    • Routledge
      (Publisher)
    PART 1
    An Introduction to the Theory of SPC
    The aim of any type of data analysis is to gain understanding from data. When we collect process performance data we see that it varies. The information in this variation is important to the understanding of how the process is performing and statistical process control (SPC) is primarily the tool for understanding variation:
       SPC is the use of statistically based tools and techniques principally for the management and improvement of processes. The main tool associated with SPC is the control chart.
       A control chart is a plot of a process characteristic, usually through time with statistically determined limits. When used for process monitoring, it helps the user to determine the appropriate type of action to take on the process.
    You may find these two definitions off-putting, and the purpose of this part of the book is to explain them, and also the basic concepts and ideas behind SPC as well as the importance and use of control charts.
    First we explain, briefly, what is meant by the term ‘process’ as it is important to understand how the term is used in the book.
    One of the crucial keys to understanding performance measurement, and hence statistical process control, is variation. If there were no variation there would be no problem: life would be much simpler and more boring. Much of a manager’s work is given over to understanding, managing and controlling variation. This whole book deals with the analysis, understanding and management of variation.
    Unfortunately statistics does come into SPC. Actually, statistics should come into all aspects of running an organisation, because statistics is all about understanding data. There are only a few main statistical measures that need to be discussed here, namely the mean, the standard deviation and the range.
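The three measures named here can be computed directly; the sample values in this sketch are hypothetical:

```python
# The three basic SPC statistics: mean, standard deviation and range.
# The measurement values below are made up for illustration.
data = [5.2, 5.0, 5.1, 4.9, 5.3]

mean = sum(data) / len(data)

# Sample standard deviation (n - 1 divisor)
std_dev = (sum((x - mean) ** 2 for x in data) / (len(data) - 1)) ** 0.5

# The range is the simplest spread measure, used on R charts
value_range = round(max(data) - min(data), 2)

print(round(mean, 2), round(std_dev, 3), value_range)
```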
  • Quality Management for the Technology Sector
    Figure 10-2. Each step is further explained below.
    Figure 10-2 A Six-Step Process for Statistical Process Control Implementation. These steps assure critical parameters are identified and selected for SPC applications.

    Selecting Processes for Statistical Control

    One of the first steps in implementing statistical process control is realizing that the technique is not appropriate for all processes.
    In our experience, processes ideally suited for statistical process control are repetitive (in the sense that they produce quantities of similar items), have a high inspection content, have higher than desired reject rates, and create items with dimensions or other characteristics that are fairly straightforward to measure. Processes with high inspection content are potential candidates because statistical process control greatly reduces or eliminates inspection. Processes producing parts with high reject rates are candidates because statistical process control, when properly implemented, frequently eliminates defects. Processes that are not producing rejects should also be considered for statistical control, but processes with high reject rates should be considered first.
    Processes that produce items with dimensions or other characteristics that are fairly straightforward to measure are good statistical control candidates because straightforward measurement techniques simplify statistical process control training and acceptance.

    Attributes Versus Variables Data

    The last characteristic listed above for identifying statistical process control candidates (processes producing parts with dimensions or other characteristics that are fairly straightforward to measure) brings us to another issue. This issue concerns attributes and variables data. Variables data are related to measurements with quantifiable values (for example, shaft diameters are measured and recorded with specific values, as shown in Figure 1). Attributes data only reflect a yes or no decision, such as whether an item passed or failed a test. Attributes data are recorded in such terms as pass or fail, go or no go, yes or no, true or false, accept or reject, etc. There are no quantifiable values included with attributes data.
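Because attributes data carry only a pass/fail outcome, they are usually charted differently from variables data; a common choice is the p chart for the fraction rejected. A minimal sketch of the standard 3-sigma p-chart limits, assuming made-up reject counts and sample sizes:

```python
# Hypothetical sketch: attributes (pass/fail) data on a p chart.
# Reject counts and sample size are illustrative.

def p_chart_limits(rejects, sample_size):
    """3-sigma limits for the fraction defective (attributes data)."""
    p_bar = sum(rejects) / (len(rejects) * sample_size)
    sigma = (p_bar * (1 - p_bar) / sample_size) ** 0.5
    lcl = max(0.0, p_bar - 3 * sigma)  # a fraction cannot go below zero
    ucl = p_bar + 3 * sigma
    return lcl, p_bar, ucl

# Number of rejects in ten samples of 100 items each
rejects = [4, 6, 5, 3, 7, 5, 4, 6, 5, 5]
lcl, centre, ucl = p_chart_limits(rejects, 100)
print(round(lcl, 3), centre, round(ucl, 3))
```

Variables data (actual measured values) would instead go on an averages chart, which uses far more of the information in each measurement.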
  • Problem Solving and Data Analysis Using Minitab
    eBook - ePub

    Problem Solving and Data Analysis Using Minitab

    A Clear and Easy Guide to Six Sigma Methodology

    • Rehman M. Khan(Author)
    • 2013(Publication Date)
    • Wiley
      (Publisher)

    Chapter 7

    Statistical Process Control

    7.1 The Origins of Statistical Process Control

    Dr Walter A. Shewhart is credited as being the father of Statistical Process Control (SPC). It was in the 1920s that Shewhart wrote to his boss and proposed his ideas. At that time he worked for the Western Electric Company at the Hawthorne Works. (Incidentally, this is the same Hawthorne works where the famous Hawthorne Effect, which relates to industrial psychology, was recognised.)
    Later, William Deming applied statistical control techniques to the production of munitions and essentials during WWII. Deming also worked in postwar Japan, applying the same techniques that helped shape Japan into an industrial giant.
    It was not until much later that control charts were used in nonmanufacturing environments, beginning with computer software.
    Dr Shewhart recognised the importance of reducing variation in manufacturing processes. He also concluded that continual process adjustments by operators would in all likelihood increase variation and result in more defects.
    He looked at problems in terms of common cause and special cause variation. He concluded that every process displays common cause variation, and he set limits for when the variation was caused by new or additional factors, which he called special cause variation.
    Dr Shewhart used ±3 standard deviations as the control limits to separate common cause variation from special cause variation. This figure shows a typical control chart indicating a process that is in control.

    7.2 Common Cause and Special Cause Variation

    The control chart provides a simple way of detecting special cause variation. Usually the Control Limits that are calculated will be within the tolerance limits of the process. The aim is to react when the control limits are exceeded by investigating and rectifying the special cause. If we do this before we go out of the tolerance limits we avoid defects and waste.
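The arrangement described here, control limits lying inside the wider tolerance limits so that special causes are investigated before defects occur, can be sketched as a simple classification. All limit values in this sketch are hypothetical:

```python
# Hypothetical sketch: control limits sit inside the (wider) tolerance limits,
# so special causes are caught and corrected before defects are produced.

TOL_LOW, TOL_HIGH = 9.5, 10.5   # tolerance (specification) limits
CTL_LOW, CTL_HIGH = 9.8, 10.2   # statistically derived control limits

def classify(x):
    if x < TOL_LOW or x > TOL_HIGH:
        return "defect"          # out of specification: waste already made
    if x < CTL_LOW or x > CTL_HIGH:
        return "special cause"   # investigate now, before defects occur
    return "in control"          # only common cause variation present

print([classify(x) for x in [10.0, 10.3, 10.6]])
```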
  • Total Manufacturing Assurance
    eBook - ePub

    Total Manufacturing Assurance

    Controlling Product Quality, Reliability, and Safety

    • Douglas Brauer, John Cesarone(Authors)
    • 2022(Publication Date)
    • CRC Press
      (Publisher)
    Lot Sentencing) on the basis of their overall quality, and the vendor is left to worry about the cause of any degradation. Since these techniques are used to decide whether a batch is to be accepted or not, it is known as acceptance sampling.
    The fundamental mathematical science of statistical process control is probability and statistics. In statistical process control, statistical analysis is used to calculate probabilities, and the probabilities enable deriving conclusions as to whether a process is in control (i.e., producing good product).
    Probability is a measure of knowledge about some aspect of the world. When knowledge is complete, the probability of any outcome or event is either one or zero, depending on whether the event has occurred or not. When knowledge is incomplete, the probability is somewhere between zero and one.
    The classic example is tossing a coin. It will land with either heads or tails up, and while it is in the air it spins in a random, incalculable manner. A perfect coin is assigned a probability of one-half for each of the possible outcomes (i.e., heads up or tails up), because in a truly random coin flip there is no way of knowing which will occur, or even whether one is more likely to occur than the other.
    Of course, with enough good sensors, detailed enough mathematical models, accurate data on the air currents in the room, and the precise torques imparted to the coin as it was flipped, one could have predicted exactly how the coin would land. The probability of the correct outcome would then be adjusted to one, and the other to zero. Without this knowledge, however, one must be satisfied with a 50% chance of each outcome.
    This is how statistics are used in process control. Just like the tumbling coin, a manufacturing process has some exact state, and some precise number of non-conforming units will be produced. Before the fact, however, this number is not known and the probabilities will have to be assigned. Statistics collected on past events are used to predict the probabilities of future events.
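A minimal sketch of this idea, assuming made-up production records:

```python
# Hypothetical sketch: statistics on past events assign probabilities to
# future ones, just as with the tumbling coin. All counts are illustrative.

past_units = 5000          # units produced while records were kept
past_nonconforming = 35    # non-conforming units observed among them

# The relative frequency of past defects is the estimated probability
# that any future unit is non-conforming.
p_defect = past_nonconforming / past_units

# Expected non-conforming units in the next batch of 1200
expected = p_defect * 1200

# Probability that a random sample of 50 units contains no defect at all
p_clean_sample = (1 - p_defect) ** 50

print(p_defect, expected, round(p_clean_sample, 3))
```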
  • Practitioner's Guide to Statistics and Lean Six Sigma for Process Improvements
    • Mikel J. Harry, Prem S. Mann, Ofelia C. De Hodgins, Richard L. Hulbert, Christopher J. Lacke(Authors)
    • 2011(Publication Date)
    • Wiley
      (Publisher)
    control means to keep something within boundaries. Remember, from previous chapters, that a process is any set of conditions or causes, which work together to produce an output or result.
    In summary, this chapter defines a process as the organization of people, procedures, machines, and materials into the work activities needed to produce a specified end result (output). It is a sequence of activities characterized by
    • Measurable inputs
    • Value-added (VA) activities
    • Measurable outputs
    • Repeatability
    15.4 STATISTICAL CONTROL SYSTEMS
    The term process refers to the operation of a single cause; in a wider sense, it may refer to the operation of a very complex system of causes. This broad definition permits process capability studies to be applied to many activities, including engineering, operating, management, clerical, services, accounting, financial, audit, information technology, systems, communications, sales, marketing, advertising, or other organizations that experience merchandise losses. Key techniques covered in this chapter include
    • Mistakeproofing
    • Process control and the use of control charts
    • Process control charts that were originally used by manufacturing
    15.4.1 Mistakeproofing
    Mistakeproofing or poka-yoke, briefly discussed in Chapter 7, is derived from the Japanese poka (inadvertent error) and yokeru (avoidance). Dr. Shigeo Shingo,1 an industrial engineer at Toyota, developed the poka-yoke concept. It is one of several control concepts where the solution is static, as opposed to the dynamic control obtained from a closed-loop feedback control system. It was originally named foolproofing, but Shingo was concerned that some workers might feel offended, and the methodology was renamed. Poka-yoke
  • Handbook of Semiconductor Manufacturing Technology
    • Yoshio Nishi, Robert Doering(Authors)
    • 2017(Publication Date)
    • CRC Press
      (Publisher)
    Wiley Encyclopedia of Electrical and Electronics Engineering, Vol. 19, 59-86. New York: Wiley, 1999.
    2. Two-Year Keithley Study Eyes Process Control, WaferNews , Vol. 4.29, 3, 6. 28 July 1977.
    3. Box, G., and A. Luceno. Statistical Control by Monitoring and Feedback Adjustment. Wiley Series in Probability and Statistics, Wiley, 1997. Also, course from University of Wisconsin-Madison College of Engineering, Feedback Adjustment for SPC, How to Maximize Process Capability Using Feedback Adjustment, Box, G., J. Hunter, and S. Bisgaard.
    4. Seborg, D. E., T. F. Edgar, and D. A. Mellichamp. Process Dynamics and Control . New York: Wiley, 1989.
    5. Vander Wiel, S. A., W. T. Tucker, F. W. Faltin, and N. Doganaksoy. “Algorithmic Statistical Process Control: Concepts and an Application.” Technometrics 34, no. 3 (1992): 286-97.
    6. Tucker, W. T., and F. W. Faltin. “Algorithmic Statistical Process Control: An Elaboration.” Technometrics 35, no. 4 (1993): 363-75.
    7. MacGregor, J. F. “Interfaces between Process Control and On-Line Statistical Process Control.” AIChE Comput. Syst. Technol. Div. Commun. 10, no. 2 (1987): 9-20.
    8. Box, G., and T. Kramer. “Statistical Process Monitoring and Feedback Adjustment: A Discussion.” Technometrics 34, no. 3 (1992): 251-67.
    9. Hoerl, R. W., and A. C. Palm. “Discussion: Integrating SPC and APC.” Technometrics 34, no. 3 (1992): 268-72.
    10. MacGregor, J. F. “Discussion.” Technometrics 34, no. 3 (1992): 273-5.
    11. Tucker, W. T. “Discussion.” Technometrics 34, no. 3 (1992): 275-7.
    12. Vander Wiel, S. A., and S. B. Vardeman. “Discussion.” Technometrics 34, no. 3 (1992): 278-81.
    13. Wardrop, D. M., and C. E. Garcia. “Discussion.” Technometrics 34, no. 3 (1992): 281-2.
    14. Box, G., and T. Kramer. “Response.” Technometrics 34, no. 3 (1992): 282-5.
    15. Muthukrishnan, S., and J. Stefani. “SCFab Model-Based Process Control Methodology: Development and Deployment for Manufacturing Excellence.” TI Tech. J.
  • Statistics in Engineering
    eBook - ePub

    Statistics in Engineering

    With Examples in MATLAB® and R, Second Edition

    • Andrew Metcalfe, David Green, Tony Greenfield, Mayhayaudin Mansor, Andrew Smith, Jonathan Tuke(Authors)
    • 2019(Publication Date)
    Acceptance sampling could be used to control the AOQL, but the company aims to deal with regular suppliers who can be relied upon to deliver high quality components. Once a supplier has demonstrated an ability to consistently meet the specification, acceptance sampling is only occasionally implemented, or dispensed with. However, a single item is inspected from every delivery to ensure that the delivery corresponds to the order.
    Another criticism of acceptance sampling is that it is based on the notion that a small proportion of defects is acceptable, and if the proportion is as small as 1 in 1 000 (1 000 ppm) the sample size for an effective acceptance sampling scheme is unreasonably large. [Deming, 2000] says that acceptance sampling techniques “guarantee that some customers will get defective products” and also bemoans the resources used to implement them. Nevertheless, acceptance sampling may be useful when dealing with new suppliers if defects can be defined so that a small proportion is acceptable.
    10.6    Statistical quality control charts
    Statistical quality control (SQC) charts are used to monitor processes that are generally in statistical control, and to provide early warning of any special cause variation that affects the process. They plot statistics that represent the quality of the items produced, obtained from samples over time. The plot has a region which corresponds to the process appearing to be in statistical control, and a region or regions which indicate that some action needs to be taken. The action need not be as drastic as stopping a production line, and it may just be to monitor the process more closely, but there is little benefit to be obtained from SQC charts if points in the action region are ignored. Walter A. Shewhart is credited with introducing the idea at the Western Electric Company’s factory in Cicero, Illinois in 1924.
    A common feature of SQC charts is that there are target values for the variables that are being monitored and that the standard deviations of the variables are known. The standard deviations are estimated from records of the process when it has been deemed to be in statistical control, and have already been used to demonstrate capability.
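A hedged sketch of such a chart for sample means: with a target value and a standard deviation already estimated from in-control records, the 3-sigma action limits are target ± 3σ/√n. All numbers below are illustrative:

```python
# Hypothetical sketch of an SQC chart for sample means. The target value,
# known standard deviation, and sample data are illustrative only.
import math

target = 50.0    # target value for the monitored quality characteristic
sigma = 2.0      # standard deviation estimated from in-control records
n = 4            # observations averaged in each sample

# 3-sigma action limits for the mean of n observations
half_width = 3 * sigma / math.sqrt(n)
lcl, ucl = target - half_width, target + half_width

sample_means = [49.8, 50.5, 53.4, 50.1]
in_action_region = [i for i, m in enumerate(sample_means) if m < lcl or m > ucl]
print((lcl, ucl), in_action_region)
```

Averaging n observations narrows the limits by a factor of √n, which is why charts of sample means detect shifts sooner than charts of individual values.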
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.