Thursday, November 15, 2007
SPC Control Limits (LCL and UCL)
Control Limits in SPC, as understood by its founder Shewhart, should not be confused with Confidence Intervals in probability theory. In particular, it is a mistake to use a Student t-distribution Confidence Interval to estimate SPC Control Limits.
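To make the distinction concrete, here is a minimal sketch in Python, assuming made-up subgroup data and the standard control-chart constant A2 for subgroups of five. It contrasts Shewhart X-bar control limits, computed from the average subgroup range, with a Student t confidence interval for the grand mean; the two answer different questions and have very different widths:

```python
import numpy as np
from scipy import stats

# Hypothetical data: 20 subgroups of n = 5 measurements from a stable process.
rng = np.random.default_rng(0)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(20, 5))

xbar = subgroups.mean(axis=1)            # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()  # average range R-bar

# Shewhart X-bar chart limits: grand mean +/- A2 * R-bar
# (A2 = 0.577 is the standard control-chart constant for n = 5).
A2 = 0.577
grand_mean = xbar.mean()
lcl, ucl = grand_mean - A2 * rbar, grand_mean + A2 * rbar

# Student-t confidence interval for the grand mean -- a different question:
# it bounds our uncertainty about the mean, not the expected spread of future
# subgroup means, so it is far narrower than the control limits.
ci = stats.t.interval(0.95, df=len(xbar) - 1,
                      loc=grand_mean, scale=stats.sem(xbar))

print(f"Control limits : [{lcl:.3f}, {ucl:.3f}]")
print(f"95% t-CI (mean): [{ci[0]:.3f}, {ci[1]:.3f}]")
```

The control limits describe how far individual subgroup means of a stable process are expected to wander; the confidence interval only describes how well the grand mean is known, which is why substituting one for the other is a mistake.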
The basic aim of Statistical Process Control (SPC) is to make the Process Control Limits fit within the Specification Limits.
Thursday, November 1, 2007
Specifications Limits (LSL and USL)
The basic aim of Statistical Process Control (SPC) is to make the Process Control Limits fit within the Specification Limits.
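As a rough illustration (with hypothetical numbers only), checking whether the process fits within the specifications usually means comparing the natural ±3-sigma process spread with the LSL and USL, which is what the Cp and Cpk capability indices summarize:

```python
# Illustrative only: hypothetical specification and process values.
LSL, USL = 9.0, 11.0          # specification limits set by the customer/design
mu, sigma = 10.1, 0.25        # estimated process mean and standard deviation

# The "natural" process limits are conventionally mean +/- 3 sigma.
lcl, ucl = mu - 3 * sigma, mu + 3 * sigma

# Capability indices: Cp compares widths, Cpk also accounts for centering.
Cp = (USL - LSL) / (6 * sigma)
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)

print(f"Process limits [{lcl:.2f}, {ucl:.2f}] vs specs [{LSL}, {USL}]")
print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}  (values > 1 mean the process fits)")
```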
Non-Normal Distributions in the Real World
That's a big myth: the Normal Law, especially in manufacturing, is very rare in practice. In fact, that is the very foundation of Shewhart's notion of a "controlled process": if the Normal Law held everywhere by magic, there would have been no need for him to introduce that concept in his book "Statistical Method from the Viewpoint of Quality Control".
It is astonishing that some practitioners in the Quality field rediscover the wheel even after they have heard of Walter A. Shewhart, whereas others seem never to have even started down that road. I reproduce below an article by Thomas Pyzdek.
I'm not affiliated with him and do not know him at all; I post his opinions here only to illustrate the claim above and to serve as a reference for this blog's other article, "Shewhart/Deming Statistical Process Control vs Six Sigma".
Non-Normal Distributions in the Real World
Copyright © 2000 by Thomas Pyzdek, all rights reserved
Reproduction allowed if no changes are made to content
One day, early in my quality career, I was approached by my friend Wayne, the manager of our galvanizing plant.
'Tom," he began, "I've really been pushing quality in my area lately, and everyone's involved. We're currently working on a problem with plating thickness. Your reports always show a 3-percent to 7-percent reject rate, and we want to drive that number down to zero."
I, of course, was pleased. The galvanizing area had been the company's perennial problem child. "How can I help?" I asked.
"We've been trying to discover the cause of the low thicknesses, but we're stumped. I want to show copies of the quality reports to the team so they can see what was happening with the process when the low thicknesses were produced."
"No problem:' I said, "I'll have them for you this afternoon."
Wayne left, and I went to my galvanizing reports file. The inspection procedure called for seven light poles to be sampled and plotted each hour. Using the reports, I computed the daily average and standard deviation by hand (this was before the age of personal computers). Then, using a table of normal distribution areas, I found the estimated percent below the low specification limit. This number had been reported to Wayne and a number of others. As Wayne had said, the rate tended to be between 3 percent and 7 percent.
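(An illustrative aside, not part of the original article: the hand calculation described above amounts to a normal-tail area, something like the sketch below with hypothetical numbers. It is only as good as the normality assumption behind it.)

```python
from scipy.stats import norm

# Hypothetical daily summary from the plating-thickness reports.
mean_thickness = 3.8   # mils, daily average
std_thickness = 0.5    # mils, daily standard deviation
LSL = 3.0              # minimum thickness specification

# Area of the fitted normal curve below the lower spec limit.
z = (LSL - mean_thickness) / std_thickness
pct_below = norm.cdf(z) * 100
print(f"Estimated {pct_below:.1f}% below the minimum (IF the data were normal)")
# ~5.5% here -- yet, as the story shows, a bounded, skewed process
# can produce such an estimate while no actual part is ever below spec.
```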
I searched through hundreds of galvanizing reports, but I didn't find a single thickness below the minimum. My faith in the normal distribution wasn't shaken, however. I concluded that the operators must be "adjusting" their results by not recording out-of-tolerance thicknesses. I set out for the storage yard, my thickness gage in hand, to prove my theory.
Hundreds of parts later, I admitted defeat. I simply couldn't find any thickness readings below the minimum requirement. The hard-working galvanizing teams met this news with shock and dismay.
"How could you people do this to us?" Wayne asked.
This embarrassing experience led me to begin a personal exploration of just how common normal distributions really are. After nearly two decades of research involving thousands of real-world manufacturing and nonmanufacturing operations, I have an announcement to make: Normal distributions are not the norm.
You can easily prove this by collecting data from live processes and evaluating it with an open mind. In fact, the early quality pioneers (such as Walter A. Shewhart) were fully aware of the scarcity of normally distributed data. Today, the prevailing wisdom seems to say, "If it ain't normal, something's wrong." That's just not so.
For instance, most business processes don't produce normal distributions. There are many reasons why this is so. One important reason is that the objective of most management and engineering activity is to control natural processes tightly, eliminating sources of variation whenever possible. This control often results in added value to the customer. Other distortions occur when we try to measure our results. Some examples of "de-normalizing" activities include human behavior patterns, physical laws and inspection.
Human Behavior Patterns
Figure 1 shows a histogram of real data from a billing process. A control chart of days-to-pay (i.e., the number of days customers take to pay their bills) for nonprepaid invoices showed statistical control. The histogram indicates that some customers like to prepay, thus eliminating the work associated with tracking accounts payable. Customers who don't prepay tend to send payments that arrive just after the due date. There is a second, smaller spike after statements are sent, then a gradual drop-off. The high end is unbounded because a few of the customers will never pay their bills. This pattern suggests a number of possible process improvements, but the process will probably never produce a normally distributed result. Human behavior is rarely random, and processes involving human behavior are rarely normal.
Physical Laws
Nature doesn't always follow the "Normal Law" either. Natural phenomena often produce distinctly non-normal patterns. The hot-dip galvanizing process discussed previously is an example. A metallurgist described the process to me (but too late, alas, to prevent the aforementioned debacle) as the creation of a zinc-iron alloy at the boundary. The alloy forms when the base material reaches the temperature of the molten zinc. Pure zinc will accumulate after the alloy layer has formed. However, if the part is removed before the threshold temperature is reached, no zinc will adhere to the base metal. Such parts are so obviously defective that they're never made.
Thus, the distribution is bounded on the low side by the alloy-layer thickness, but (for all practical purposes) unbounded on the high side because pure zinc will accumulate on top of the alloy layer as long as the part remains submerged. Figure 2 shows the curve for the process - a non-normal curve.
Inspection
Sometimes inspection itself can create non-normal data. ANSI Y14.5, a standard for dimensioning and tolerancing used by aerospace and defense contractors, describes a concept called "true position." The true position of a feature is found by converting an X and Y deviation from target to a radial deviation and multiplying by two. Even if X and Y are normally distributed (of course, they usually aren't), the true position won't be. True position is bounded at zero and the shape often depends solely on the standard deviation.
Many other inspection procedures create non-normal distributions from otherwise normal data. Perpendicularity might be normally distributed if the actual angle were measured and recorded. Quite often, though, perpendicularity is measured as the deviation from 90 degrees, with 88 degrees and 92 degrees both being shown as 2 degrees from 90 degrees. The shape of the resulting distribution varies depending on the mean and standard deviation. Its shape can range from a normal curve to a decidedly non-normal curve. This apparent discrepancy also applies to flatness, camber and most other form callouts in ANSI Y14.5. The shape of the curve tells you nothing about your control of the process.
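(Another illustrative aside, not from the original article: a short simulation with assumed standard deviations shows how these two inspection conventions turn symmetric, normal-looking deviations into bounded, skewed distributions.)

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(1)
n = 100_000

# True position: X and Y deviations assumed normal around target...
x = rng.normal(0.0, 0.05, n)
y = rng.normal(0.0, 0.05, n)
true_position = 2 * np.hypot(x, y)     # ...but the radial result is not normal

# Perpendicularity recorded as |angle - 90|: a folded (half-) normal shape.
angle = rng.normal(90.0, 0.5, n)
perpendicularity = np.abs(angle - 90.0)

print(f"skew of X deviations     : {skew(x):+.2f}  (about 0, symmetric)")
print(f"skew of true position    : {skew(true_position):+.2f}  (bounded at 0, skewed)")
print(f"skew of perpendicularity : {skew(perpendicularity):+.2f}  (folded, skewed)")
```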
Implications
At this point, a purist might say, "So what?" After all, any model is merely an abstraction of reality and in error to some extent. Nevertheless, when the error is so large that it has drastic consequences, the model should be re-evaluated and perhaps discarded. Such is often the case with the normal model.
Process capability analysis (PCA) is a procedure used to predict the long-term performance of statistically controlled processes. Virtually all PCA techniques assume that the process distribution is normal. If it isn't, PCA methods, such as Cpk, may show an incapable process as capable, or vice versa. Such methods may predict high reject rates even though no rejects ever appear (as with the galvanizing process discussed earlier) or vice versa.
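(A further aside, not from the original article: the sketch below generates a bounded, right-skewed "thickness" sample, roughly in the spirit of the galvanizing story, then computes the usual one-sided Cpk and the normal-theory reject prediction. The distribution and its parameters are invented for illustration; the point is only that the normal-theory prediction can be far from what the data actually do.)

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Hypothetical plating thicknesses: bounded below at the alloy layer (~3.2 mils)
# with a long right tail -- skewed, like the galvanizing process above.
LSL = 3.0
thickness = 3.2 + rng.lognormal(mean=-1.0, sigma=0.7, size=50_000)

mu, sigma = thickness.mean(), thickness.std(ddof=1)
Cpk_lower = (mu - LSL) / (3 * sigma)                  # one-sided, lower spec only

predicted_ppm = norm.cdf((LSL - mu) / sigma) * 1e6    # normal-theory prediction
actual_ppm = (thickness < LSL).mean() * 1e6           # what the data really show

print(f"Cpk (lower)      : {Cpk_lower:.2f}")
print(f"Predicted rejects: {predicted_ppm:,.0f} ppm under the normal assumption")
print(f"Actual rejects   : {actual_ppm:,.0f} ppm (the lower bound makes them impossible)")
```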
If you're among the enlightened few who have abandoned the use of "goal-post tolerances" and PCA, you'll find that assuming normality hampers your efforts at continuous improvement. If the process distribution is skewed, the optimal setting (or target) will be somewhere other than the center of the engineering tolerance, but you'll never find it if you assume normality. Your quality-improvement plan must begin with a clear understanding of the process and its distribution.
Failure to understand non-normality leads to tampering, increased reject rates, sub-optimal process settings, failure to detect special causes, missed opportunities for improvement, and many other problems. The result is loss of face, loss of faith in SPC in general, and strained customer-supplier relations.
Learning without understanding
As in physics, many people seem to learn Statistical Process Control through Six Sigma without really understanding it. This flaw in conventional education has been underlined in the chapter of Richard Feynman's book entitled "Learning without understanding":
"I often liked to play tricks on people when I was at MlT. One time, in mechanical drawing class, some joker picked up a French curve (a piece of plastic for drawing smooth curves - a curly, funny-looking thing) and said, "I wonder if the curves on this thing have some special formula?"
I thought for a moment and said, "Sure they do. The curves are very special curves. Lemme show ya," and I picked up my French curve and began to turn it slowly. "The French curve is made so that at the lowest point on each curve, no matter how you turn it, the tangent is horizontal."
All the guys in the class were holding their French curve up at different angles, holding their pencil up to it at the lowest point and laying it along, and discovering that, sure enough, the tangent is horizontal. They were all excited by this "discovery" even though they had already gone through a certain amount of calculus and had already "learned" that the derivative (tangent) of the minimum (lowest point) of any curve is zero (horizontal). They didn't put two and two together. They didn't even know what they "knew."
I don't know what's the matter with people: they don't learn by understanding, they learn by some other way - by rote, or something. Their knowledge is so fragile."
Shewhart/Deming Statistical Process Control vs Six Sigma
There are a number of criticisms of Six Sigma, or at least of its marketing hype:
- Juran's interview
- Desperately Seeking Sigma
Juran said:
"I don't like the hype, and I don't think the hype is going to last. Something that is as successful as the improvement process gets label after label after label. Those labels come and go, but the basic concept stays. There will be some marketer that finds a new label, finds a way to make that a fad and off he'll go, doing the same thing we did before under a new label."
Beyond the hype criticisms, Six Sigma (associated with Motorola) does not focus on the underlying philosophy of Statistical Process Control founded by Walter A. Shewhart and later popularized by his student W. Edwards Deming, at least as it is understood and practiced by the community of users, vendors and consultants. This is unlikely to change with Lean Six Sigma.
Why? Though common and special causes appear in the majority of Six Sigma glossaries, people seem to "learn without understanding", to borrow the title of a chapter (excerpt here) from Richard Feynman's book. In particular, many presume that the process under study follows the Normal distribution. To illustrate the point, it is interesting to read the testimony of a quality practitioner and consultant in one of his articles, entitled "Non-Normal Distributions in the Real World", where he says:
"After nearly two decades of research involving thousands of real-world manufacturing and nonmanufacturing operations, I have an announcement to make: Normal distributions are not the norm.
You can easily prove this by collecting data from live processes and evaluating it with an open mind. In fact, the early quality pioneers (such as Walter A. Shewhart) were fully aware of the scarcity of normally distributed data. Today, the prevailing wisdom seems to say, “If it ain’t normal, something’s wrong.” That’s just not so."
In fact, the flaw seems rooted in the core definition of Six Sigma as it is understood by some, for example here:
Below is Bain’s explanation of Six Sigma:
“Sigma” is a measure of statistical variation. Six Sigma indicates near perfection and is a rigorous operating methodology aimed to ensure complete customer satisfaction by ingraining a culture of excellence, responsiveness and accountability within an organization. Specifically, it requires the delivery of defect-free products or services 99.9997 percent of the time. That means that out of a million products or service experiences, only 3 would fail to meet the customer’s expectations. (The average company runs at around Three Sigma, or 66,800 errors per million.)
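Those figures come from conventional normal-theory arithmetic with the customary 1.5-sigma long-term shift assumed; a few lines of Python reproduce them approximately:

```python
from scipy.stats import norm

# Conventional Six Sigma arithmetic: a process whose nearest specification
# limit is k standard deviations away, with the customary 1.5-sigma long-term
# shift assumed, yields this many defects per million opportunities (DPMO).
def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    return norm.sf(sigma_level - shift) * 1e6   # one-sided tail area

print(f"Six Sigma  : {dpmo(6):.1f} DPMO  (~3.4, the famous figure)")
print(f"Three Sigma: {dpmo(3):,.0f} DPMO  (~66,800, as quoted above)")
```

Note that this arithmetic itself rests on the normality assumption questioned throughout this blog.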
There is no need for Six Sigma to become a "Black Belt".
Process Capability
Wednesday, October 31, 2007
Common and Special Causes
"La nature a sans doute ses habitudes, provenant du retour des causes, mais ce n'est que la plupart du temps. C'est pourquoi, ne peut-on pas objecter qu'une nouvelle expérience puisse s'écarter un tant soit peu de la loi de toutes les précédentes, du fait de la variabilité même des choses ? De nouvelles maladies se répandent souvent sur le genre humain et par conséquent quelque soit le nombre de morts dont vous avez fait l'expérience ce n'est pas pour autant que vous avez établi les limites des choses de la nature au point qu'elle ne puisse en varier dans le futur."
"Nature has established patterns originating in the return of events but only for the most part. Therefore, can't we argue that a new experience could deviate from the law of all the previous ones, even a little bit, because of the very essence of variability of things ? New illnesses often flood the human race, so that no matter how many experiments you have done, you have not thereby established a limit on the nature of events so that in the future they could not vary."
Later, John Maynard Keynes directly refers to Leibniz in his essay on "Probability in relation to the theory of knowledge". According to a Cambridge Journal article, "Keynes's Treatise on Probability contains some quite unusual concepts, such as non-numerical probabilities and the ‘weights of the arguments’ that support probability judgements. Their controversial interpretation gave rise to a huge literature about ‘what Keynes really did mean’, also because Keynes's later views in macroeconomics ultimately rest on his ideas on uncertainty and expectations formation". But what Keynes really meant is simply what he once stated clearly:
"By uncertain knowledge … I do not mean merely to distinguish what is known for certain from what is only probable. The game of roulette is not subject, in this sense, to uncertainty ...
The sense in which I am using the term is that in which the prospect of a European war is uncertain, or the price of copper and the rate of interest twenty years hence, or the obsolescence of a new invention … About these matters there is no scientific basis on which to form any calculable probability whatever. We simply do not know!"
At the same time, during the 1920s, Walter A. Shewhart, statistician and engineer, was working to improve the quality of telephone equipment manufactured by Western Electric for the Bell System.
Shewhart framed the problem in terms of Common and Special Causes of variation. Though Shewhart may not have been the first to recognize this distinction, he was the first to establish an operational means of distinguishing between the two: on May 16, 1924, he wrote an internal memo introducing Statistical Process Control, with the Control Chart as a tool for continuous process improvement.
Quality Gamebox
Quality Gamebox 2.0 is a collection of simulations and experiments that is both entertaining and educational. Each game demonstrates and explains classic concepts that are useful in quality management and SPC. Quality Gamebox uses colorful graphics, animation, and sound to enhance the learning experience.
Deming’s red bead experiment demonstrates the theory of variation while his funnel experiment shows the effects of tampering. John McConnell’s dice experiment illustrates the benefits of reducing variation. The central limit theorem shows the effect of sampling, and the quincunx explains the normal curve.
Quality Gamebox is useful for training staff members, consultants, or anyone who wants to illustrate basic statistical concepts.
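For readers without the software, the quincunx demonstration is easy to reproduce; here is an illustrative Python sketch (not part of Quality Gamebox) in which each ball's final bin is the sum of many small left/right bounces, piling up into the familiar bell shape:

```python
import numpy as np

rng = np.random.default_rng(3)

# Quincunx (Galton board): each ball bounces left (-1) or right (+1) off
# 12 rows of pins; its final bin is the sum of those small random nudges.
balls, rows = 10_000, 12
positions = rng.choice([-1, 1], size=(balls, rows)).sum(axis=1)

# A crude text histogram: the sums pile up into an approximately normal curve.
bins, counts = np.unique(positions, return_counts=True)
for b, c in zip(bins, counts):
    print(f"{b:+3d} | {'#' * (c // 50)}")
```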
Sunday, October 21, 2007
Really understanding Statistical Process Control (SPC)
"Variation from common-cause systems should be left to chance, but special causes of variation should be identified and eliminated."
1°) SPC indicates when an action should be taken in a process, but it also indicates when NO action should be taken. An example is a person who would like to maintain a constant body weight and takes weight measurements weekly. A person who does not understand SPC concepts might start dieting every time his or her weight increased, or eat more every time his or her weight decreased. This type of action could be harmful and possibly generate even more variation in body weight. SPC would account for normal weight variation and better indicate when the person is in fact gaining or losing weight (a small simulation of this effect follows the list below).
2°) The rule should not be misinterpreted as meaning that variation from common causes should be ignored. Rather, common-cause variation is explored "off-line." That is, we look for long-term process improvements to address common-cause variation.
3°) A controlled process isn't necessarily a sign of good management, nor is an out-of-control process necessarily producing non-conforming product.
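Here is the small simulation promised in point 1°) above: a hypothetical weekly weigh-in with only common-cause variation, compared with the same process when every deviation is "corrected" (the over-adjustment of Deming's funnel experiment, rule 2). The numbers are invented; the roughly 1.4-fold increase in spread is the point.

```python
import numpy as np

rng = np.random.default_rng(4)
weeks, target = 1_000, 70.0           # hypothetical weekly weigh-ins, kg

noise = rng.normal(0.0, 0.5, weeks)   # common-cause variation only

# Strategy 1: leave the stable process alone.
hands_off = target + noise

# Strategy 2: "tamper" -- after each weigh-in, adjust next week's intake
# to cancel the deviation just observed (Deming's funnel, rule 2).
adjustment = 0.0
tampered = np.empty(weeks)
for t in range(weeks):
    tampered[t] = target + adjustment + noise[t]
    adjustment -= tampered[t] - target    # react to every wiggle

print(f"std without tampering: {hands_off.std():.2f} kg")
print(f"std with tampering   : {tampered.std():.2f} kg  (about 1.4x larger)")
```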
Statistical Process Control (SPC)
"A phenomenon will be said to be controlled when, through the use of past experience, we can predict, at least within limits, how the phenomenon may be expected to vary in the future. Here it is understood that prediction within limits means that we can state, at least approximately, the probability that the observed phenomenon will fall within the given limits."
This definition means that control is not equivalent to a complete absence of variation but rather that the system is in a state where variation is predictable within some fixed limit.
Shewhart also realized that frequent process-adjustment in reaction to non-conformance actually increased variation and degraded quality. That's why he expressed the fundamental rule of statistical process control in this way:
"Variation from common-cause systems should be left to chance, but special causes of variation should be identified and eliminated."
Without Statistical Process Control guidance, there could be endless debate over whether special or common causes were to blame for variability. This is crucial, as the types of action needed to reduce the variability in each case are of a different nature.
Six Sigma
SPC Book - Statistical Process Control: The Deming Paradigm and Beyond
While the common practice of Quality Assurance aims to prevent bad units from being shipped beyond some allowable proportion, statistical process control (SPC) ensures that bad units are not created in the first place. Its philosophy of continuous quality improvement, to a great extent responsible for the success of Japanese manufacturing, is rooted in a paradigm as process-oriented as physics, yet produces a friendly and fulfilling work environment. The first edition of this groundbreaking text showed that the SPC paradigm of W. Edwards Deming was not at all the same as the Quality Control paradigm that has dominated American manufacturing since World War II.

Statistical Process Control: The Deming Paradigm and Beyond, Second Edition reveals even more of Deming's philosophy and provides more techniques for use at the managerial level. Explaining that CEOs and service industries need SPC at least as much as production managers, it offers precise methods and guidelines for their use. Using the practical experience of the authors working both in America and Europe, this book shows how SPC can be implemented in a variety of settings, from health care to manufacturing. It also provides you with the necessary technical background through mathematical and statistical appendices.

According to the authors, companies with managers who have adopted the philosophy of statistical process control tend to survive. Those with managers who do not are likely to fail. In which group will your company be?
SPC Book - Statistical Method from the Viewpoint of Quality Control by Walter A. Shewhart
Nearly all of the books about Statistical Process Control are quite awful: they contain errors, poor examples, and promote bad practices. Apart from a few possible exceptions, there is only one book that can be recommended with confidence: "Statistical Method from the Viewpoint of Quality Control" by Walter A. Shewhart, the original founder of Industrial Quality Management.
Superficial readers and statistics students could easily mistake this book for yet another book on probability calculations, whereas it fundamentally deals with the Epistemology of Probability and Mathematics in the real world of Manufacturing and Science.
The current publisher's description on Amazon doesn't say much. It is better to read the original publisher's description of the book:
The application of statistical methods in mass production makes possible the most efficient use of raw materials and manufacturing processes, economical production, and the highest standards of quality for manufactured goods. In this classic volume, based on a series of ground-breaking lectures given to the Graduate School of the Department of Agriculture in 1938, Dr Shewhart illuminates the fundamental principles and techniques basic to the efficient use of statistical method in attaining statistical control, establishing tolerance limits, presenting data, and specifying accuracy and precision.
In the first chapter, devoted to statistical control, the author broadly defines the three steps in quality control: specification, production and inspection; he then outlines the historical background of quality control. This is followed by a rigorous discussion of the physical and mathematical states of statistical control, statistical control as an operation, the significance of statistical control and the future of statistics in mass production.
Chapter II offers a thought-provoking treatment of the problem of establishing limits of variability, including the meaning of tolerance limits, establishing tolerance limits in the simplest cases and in practical cases, and standard methods of measuring. Chapter III explores the presentation of measurements of physical properties and constants. Among the topics considered are measurements presented as original data, characteristics of original data, summarizing original data (both by symmetric functions and by Chebyshev's theorem), measurement presented as meaningful predictions, and measurement presented as knowledge.
Finally, Dr Shewhart deals with the problem of specifying accuracy and precision - the meaning of accuracy and precision, operational meaning, verifiable procedures, minimum quantity of evidence needed for forming a judgment and more.
In this book Shewhart asks:
What can statistical practice, and science in general, learn from the experience of industrial quality control?
He wrote in this book:
The definition of random in terms of a physical operation is notoriously without effect on the mathematical operations of statistical theory because so far as these mathematical operations are concerned random is purely and simply an undefined term. The formal and abstract mathematical theory has an independent and sometimes lonely existence of its own. But when an undefined mathematical term such as random is given a definite operational meaning in physical terms, it takes on empirical and practical significance. Every mathematical theorem involving this mathematically undefined concept can then be given the following predictive form: If you do so and so, then such and such will happen.
Statistical Process Control Software
SPC for MS Excel is used to generate and easily update SPC charts and to perform other statistical functions from Microsoft Excel spreadsheets. This affordable software is easy to learn and easy to use and fits the needs of the SPC novice or SPC expert. It is the premier Excel-based SPC program. We have reached this position by listening to what our users say they need. This product has been used around the world for more than a decade. Countries include the United States, Canada, Mexico, United Kingdom, Ireland, Spain, Greece, Hong Kong, Turkey, Indonesia, and Vietnam. It is a key part of many manufacturing and service organizations' process improvement efforts.