Control Limits define the boundaries of predictable process variation. In practice, that predictability cannot be stated without some associated probability.
Nevertheless, Control Limits in SPC, from the founder Shewhart's point of view, should not be confused with a Confidence Interval in probability theory. Notably, it is a mistake to use a Student t-distribution confidence interval to estimate SPC Control Limits.
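To make the distinction concrete, here is a minimal Python sketch (synthetic data, subgroups of five, the standard Shewhart constant A2 = 0.577, scipy for the t interval). The control limits describe where individual subgroup means are expected to fall for a process in statistical control, while the Student-t interval is a statement about the unknown process mean; the two answer different questions and have very different widths.

    # Minimal sketch (synthetic data): Shewhart X-bar chart limits vs. a Student-t CI.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    subgroups = rng.normal(loc=10.0, scale=0.5, size=(25, 5))   # 25 subgroups of n=5 (illustrative)

    xbar = subgroups.mean(axis=1)            # subgroup means
    rbar = np.ptp(subgroups, axis=1).mean()  # average subgroup range
    A2 = 0.577                               # Shewhart constant for subgroup size n=5

    # Shewhart control limits: expected spread of individual subgroup means
    # for a process in statistical control.
    grand_mean = xbar.mean()
    ucl, lcl = grand_mean + A2 * rbar, grand_mean - A2 * rbar

    # Student-t confidence interval: a statement about the unknown process mean,
    # a different question, giving much narrower limits.
    ci = stats.t.interval(0.95, df=xbar.size - 1, loc=grand_mean, scale=stats.sem(xbar))

    print(f"Control limits:        [{lcl:.3f}, {ucl:.3f}]")
    print(f"95% t-CI for the mean: [{ci[0]:.3f}, {ci[1]:.3f}]")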
The basic aim of Statistical Process Control (SPC) is to make the Process Control Limits fit within the Specification Limits.
Specification Limits (LSL and USL)
Specification Limits define the allowed range of product variation.
The basic aim of Statistical Process Control (SPC) is to make the Process Control Limits fit within the Specification Limits.
Non-Normal Distributions in the Real World
The prominence of the Normal Law rests on the Central Limit Theorem; that is probably why the vast majority of people - after taking a statistics course at business or engineering school - believe that the Normal Law is common in the real world.
That is a big myth: the Normal Law, especially in manufacturing, is actually rare. This is the very foundation of Shewhart's notion of a "controlled process": if the Normal Law held everywhere, there would have been no need for him to introduce that concept in his book "Statistical Method from the Viewpoint of Quality Control".
It is astonishing that some practitioners in the quality field rediscover the wheel even after they have heard of Walter A. Shewhart - whereas others seem never to have started down that road at all. I reproduce below an article by Thomas Pyzdek.
I'm not affiliated with him - I don't know him at all - and I post his opinions here only to illustrate the claim above and to serve as a reference for my blog's other article, "Shewhart/Deming Statistical Process Control vs Six Sigma".
Non-Normal Distributions in the Real World
Copyright © 2000 by Thomas Pyzdek, all rights reserved
Reproduction allowed if no changes are made to content
One day, early in my quality career, I was approached by my friend Wayne, the manager of our galvanizing plant.
'Tom," he began, "I've really been pushing quality in my area lately, and everyone's involved. We're currently working on a problem with plating thickness. Your reports always show a 3-percent to 7-percent reject rate, and we want to drive that number down to zero."
I, of course, was pleased. The galvanizing area had been the company's perennial problem child. "How can I help?" I asked.
"We've been trying to discover the cause of the low thicknesses, but we're stumped. I want to show copies of the quality reports to the team so they can see what was happening with the process when the low thicknesses were produced."
"No problem:' I said, "I'll have them for you this afternoon."
Wayne left, and I went to my galvanizing reports file. The inspection procedure called for seven light poles to be sampled and plotted each hour. Using the reports, I computed the daily average and standard deviation by hand (this was before the age of personal computers). Then, using a table of normal distribution areas, I found the estimated percent below the low specification limit. This number had been reported to Wayne and a number of others. As Wayne had said, the rate tended to be between 3 percent and 7 percent.
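As a rough illustration of that calculation - a minimal Python sketch with purely hypothetical thickness numbers, scipy's normal CDF standing in for the printed table of normal areas:

    # Sketch of the normal-table estimate described above (illustrative values only).
    from scipy.stats import norm

    daily_mean = 3.8      # mils, hypothetical daily average thickness
    daily_sd   = 0.5      # mils, hypothetical daily standard deviation
    lsl        = 3.0      # mils, hypothetical lower specification limit

    # Fraction predicted below the LSL *if* thickness were normally distributed.
    z = (lsl - daily_mean) / daily_sd
    predicted_reject_rate = norm.cdf(z)
    print(f"Predicted below LSL: {predicted_reject_rate:.1%}")   # ~5% here, yet no real part was under the minimum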
I searched through hundreds of galvanizing reports, but I didn't find a single thickness below the minimum. My faith in the normal distribution wasn't shaken, however. I concluded that the operators must be "adjusting" their results by not recording out-of-tolerance thicknesses. I set out for the storage yard, my thickness gage in hand, to prove my theory.
Hundreds of parts later, I admitted defeat. I simply couldn't find any thickness readings below the minimum requirement. The hard-working galvanizing teams met this news with shock and dismay.
"How could you people do this to us?" Wayne asked.
This embarrassing experience led me to begin a personal exploration of just how common normal distributions really are. After nearly two decades of research involving thousands of real-world manufacturing and nonmanufacturing operations, I have an announcement to make: Normal distributions are not the norm.
You can easily prove this by collecting data from live processes and evaluating it with an open mind. In fact, the early quality pioneers (such as Walter A. Shewhart) were fully aware of the scarcity of normally distributed data. Today, the prevailing wisdom seems to say, "If it ain't normal, something's wrong." That's just not so.
For instance, most business processes don't produce normal distributions. There are many reasons why this is so. One important reason is that the objective of most management and engineering activity is to control natural processes tightly, eliminating sources of variation whenever possible. This control often results in added value to the customer. Other distortions occur when we try to measure our results. Some examples of "de-normalizing" activities include human behavior patterns, physical laws and inspection.
Human Behavior Patterns
Figure 1 shows a histogram of real data from a billing process. A control chart of days-to-pay (i.e., the number of days customers take to pay their bills) for nonprepaid invoices showed statistical control. The histogram indicates that some customers like to prepay, thus eliminating the work associated with tracking accounts payable. Customers who don't prepay tend to send payments that arrive just after the due date. There is a second, smaller spike after statements are sent, then a gradual drop-off. The high end is unbounded because a few of the customers will never pay their bills. This pattern suggests a number of possible process improvements, but the process will probably never produce a normally distributed result. Human behavior is rarely random, and processes involving human behavior are rarely normal.
Physical Laws
Nature doesn't always follow the "Normal Law" either. Natural phenomena often produce distinctly non-normal patterns. The hot-dip galvanizing process discussed previously is an example. A metallurgist described the process to me (but too late, alas, to prevent the aforementioned debacle) as the creation of a zinc-iron alloy at the boundary. The alloy forms when the base material reaches the temperature of the molten zinc. Pure zinc will accumulate after the alloy layer has formed. However, if the part is removed before the threshold temperature is reached, no zinc will adhere to the base metal. Such parts are so obviously defective that they're never made.
Thus, the distribution is bounded on the low side by the alloy-layer thickness, but (for all practical purposes) unbounded on the high side because pure zinc will accumulate on top of the alloy layer as long as the part remains submerged. Figure 2 shows the curve for the process - a non-normal curve.
Inspection
Sometimes inspection itself can create non-normal data. ANSI Y14.5, a standard for dimensioning and tolerancing used by aerospace and defense contractors, describes a concept called "true position." The true position of a feature is found by converting an X and Y deviation from target to a radial deviation and multiplying by two. Even if X and Y are normally distributed (of course, they usually aren't), the true position won't be. True position is bounded at zero and the shape often depends solely on the standard deviation.
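A small Python sketch (assuming normal X and Y deviations with an arbitrary spread) shows how the true-position conversion yields a distribution bounded at zero and skewed to the right, even from normal inputs:

    # Sketch: true position from X/Y deviations (twice the radial deviation).
    import numpy as np

    rng = np.random.default_rng(1)
    dx = rng.normal(0.0, 0.05, 100_000)   # X deviation from target (assumed normal)
    dy = rng.normal(0.0, 0.05, 100_000)   # Y deviation from target (assumed normal)

    # Bounded at zero, right-skewed (Rayleigh-like), not normal.
    true_position = 2.0 * np.hypot(dx, dy)

    print(f"min = {true_position.min():.4f}, mean = {true_position.mean():.4f}")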
Many other inspection procedures create non-normal distributions from otherwise normal data. Perpendicularity might be normally distributed if the actual angle were measured and recorded. Quite often, though, perpendicularity is measured as the deviation from 90 degrees, with 88 degrees and 92 degrees both being shown as 2 degrees from 90 degrees. The shape of the resulting distribution varies depending on the mean and standard deviation. Its shape can range from a normal curve to a decidedly non-normal curve. This apparent discrepancy also applies to flatness, camber and most other form callouts in ANSI Y14.5. The shape of the curve tells you nothing about your control of the process.
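The perpendicularity case can be sketched the same way (assumed normal angles centered exactly on 90 degrees): the fold at 90 degrees is what distorts the shape.

    # Sketch: measuring perpendicularity as |angle - 90| folds the distribution.
    import numpy as np

    rng = np.random.default_rng(2)
    angles = rng.normal(90.0, 1.0, 100_000)   # actual angles, assumed normal around 90 degrees
    deviation = np.abs(angles - 90.0)         # recorded value: 88 and 92 both become 2

    # With the mean exactly at 90 the recorded deviations follow a half-normal,
    # not a normal; with the mean off 90 the shape changes again.
    print(f"mean recorded deviation = {deviation.mean():.3f} degrees")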
Implications
At this point, a purist might say, "So what?" After all, any model is merely an abstraction of reality and in error to some extent. Nevertheless, when the error is so large that it has drastic consequences, the model should be re-evaluated and perhaps discarded. Such is often the case with the normal model.
Process capability analysis (PCA) is a procedure used to predict the long-term performance of statistically controlled processes. Virtually all PCA techniques assume that the process distribution is normal. If it isn't, PCA methods, such as Cpk, may show an incapable process as capable, or vice versa. Such methods may predict high reject rates even though no rejects ever appear (as with the galvanizing process discussed earlier) or vice versa.
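As a rough illustration - hypothetical numbers only, using the standard Cpk formula - the normal-theory prediction looks like this; a bounded, skewed process such as the galvanizing thickness above can yield zero actual rejects while the formula still predicts some.

    # Sketch: Cpk and the normal-theory reject prediction (illustrative values only).
    from scipy.stats import norm

    mean, sd = 4.2, 0.4          # hypothetical process mean and standard deviation
    lsl, usl = 3.0, 6.0          # hypothetical specification limits

    cpk = min(usl - mean, mean - lsl) / (3 * sd)

    # Reject rate predicted *if* the process were normal.
    predicted_rejects = norm.cdf((lsl - mean) / sd) + norm.sf((usl - mean) / sd)
    print(f"Cpk = {cpk:.2f}, predicted out-of-spec = {predicted_rejects:.4%}")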
If you're among the enlightened few who have abandoned the use of "goal-post tolerances" and PCA, you'll find that assuming normality hampers your efforts at continuous improvement. If the process distribution is skewed, the optimal setting (or target) will be somewhere other than the center of the engineering tolerance, but you'll never find it if you assume normality. Your quality-improvement plan must begin with a clear understanding of the process and its distribution.
Failure to understand non-normality leads to tampering, increased reject rates, sub-optimal process settings, failure to detect special causes, missed opportunities for improvement, and many other problems. The result is loss of face, loss of faith in SPC in general, and strained customer-supplier relations.
Learning without understanding
"The problem is that most courses teach what is wrong."
-- W. Edwards Deming - Quality Founder - (see Deming's Quotes)
As in physics, many people seem to learn Statistical Process Control through Six Sigma without really understanding it. This flawed pattern in conventional education is highlighted in the chapter of Richard Feynman's book entitled "Learning without understanding":
"I often liked to play tricks on people when I was at MlT. One time, in mechanical drawing class, some joker picked up a French curve (a piece of plastic for drawing smooth curves - a curly, funny-looking thing) and said, "I wonder if the curves on this thing have some special formula?"
I thought for a moment and said, "Sure they do. The curves are very special curves. Lemme show ya," and I picked up my French curve and began to turn it slowly. "The French curve is made so that at the lowest point on each curve, no matter how you turn it, the tangent is horizontal."
All the guys in the class were holding their French curve up at different angles, holding their pencil up to it at the lowest point and laying it along, and discovering that, sure enough, the tangent is horizontal. They were all excited by this "discovery" even though they had already gone through a certain amount of calculus and had already "learned" that the derivative (tangent) of the minimum (lowest point) of any curve is zero (horizontal). They didn't put two and two together. They didn't even know what they "knew."
I don't know what's the matter with people: they don't learn by understanding, they learn by some other way-by rote, or something. Their knowledge is so fragile."
Shewhart/Deming Statistical Process Control vs Six Sigma
"The problem is that most courses teach what is wrong."
-- W. Edwards Deming - Quality Founder - (see Deming's Quotes)
There are a number of criticisms of Six Sigma, at least of its marketing hype:
- Juran's interview
- Desperately Seeking Sigma
Juran said:
"I don't like the hype, and I don't think the hype is going to last. Something that is as successful as the improvement process gets label after label after label. Those labels come and go, but the basic concept stays. There will be some marketer that finds a new label, finds a way to make that a fad and off he'll go, doing the same thing we did before under a new label."
Beyond the hype, Six Sigma (associated with Motorola) does not focus on the underlying philosophy of Statistical Process Control founded by Walter A. Shewhart - later popularized by his student W. Edwards Deming - at least as it is understood and practiced by the community of users, vendors and consultants. Lean Six Sigma is unlikely to change this.
Why? Though common and special causes are listed in most Six Sigma glossaries, people seem to "learn without understanding" - to borrow the title of a chapter (excerpt here) from Richard Feynman's book. In particular, many presume that the process under study follows the Normal Law. To illustrate the point, it is worth reading the testimony of a quality practitioner and consultant in his article entitled "Non-Normal Distributions in the Real World", where he writes:
"After nearly two decades of research involving thousands of real-world manufacturing and nonmanufacturing operations, I have an announcement to make: Normal distributions are not the norm.
You can easily prove this by collecting data from live processes and evaluating it with an open mind. In fact, the early quality pioneers (such as Walter A. Shewhart) were fully aware of the scarcity of normally distributed data. Today, the prevailing wisdom seems to say, “If it ain’t normal, something’s wrong.” That’s just not so."
In fact, the flaw seems rooted in the core definition of Six Sigma as it is understood by some, for example here:
Below is Bain’s explanation of Six Sigma:
“Sigma” is a measure of statistical variation. Six Sigma indicates near perfection and is a rigorous operating methodology aimed to ensure complete customer satisfaction by ingraining a culture of excellence, responsiveness and accountability within an organization. Specifically, it requires the delivery of defect-free products or services 99.9997 percent of the time. That means that out of a million products or service experiences, only 3 would fail to meet the customer’s expectations. (The average company runs at around Three Sigma, or 66,800 errors per million.)
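For reference, the figures quoted by Bain follow from the conventional Six Sigma arithmetic: a normal distribution combined with the customary 1.5-sigma long-term shift. A short Python sketch of that calculation:

    # Sketch: defects per million opportunities (DPMO) implied by a sigma level,
    # using the conventional 1.5-sigma long-term shift and a normal assumption.
    from scipy.stats import norm

    def dpmo(sigma_level, shift=1.5):
        return norm.sf(sigma_level - shift) * 1_000_000

    print(f"3 sigma: {dpmo(3):,.0f} per million")   # ~66,807 - the "66,800 errors per million" quoted above
    print(f"6 sigma: {dpmo(6):,.1f} per million")   # ~3.4, i.e. 99.9997% defect-free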
Six Sigma cannot indicate near perfection if the process does not follow the normal law, and, as every practitioner should check, the normal distribution is, more often than not, the exception rather than the rule in the real world.
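A quick Python sketch of why this matters (a hypothetical lognormal process given the same mean and standard deviation as the normal model): the tail probability beyond the specification limit, and hence the claimed sigma level, changes substantially.

    # Sketch: same mean and standard deviation, very different tail behaviour.
    import numpy as np
    from scipy.stats import norm, lognorm

    mean, sd = 10.0, 1.0
    usl = mean + 4.5 * sd               # upper spec limit at "4.5 sigma" from the mean

    # Normal-model prediction.
    p_normal = norm.sf(usl, loc=mean, scale=sd)

    # A lognormal process with the same mean and standard deviation.
    sigma_ln = np.sqrt(np.log(1 + (sd / mean) ** 2))
    mu_ln = np.log(mean) - sigma_ln ** 2 / 2
    p_lognormal = lognorm.sf(usl, s=sigma_ln, scale=np.exp(mu_ln))

    print(f"P(beyond USL), normal model:    {p_normal:.2e}")
    print(f"P(beyond USL), lognormal model: {p_lognormal:.2e}")   # roughly 20x larger here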
There is no need for Six Sigma in order to become a "Black Belt".
Process Capability
Frequently, manufacturers and others are required to estimate the capability of a process, i.e., to make a reasonable estimate of what proportion of their products will conform to certain specifications.
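A minimal Python sketch of such an estimate (synthetic, skewed data): the simple empirical count and the normal-theory figure disagree noticeably when the process is not normal.

    # Sketch: estimating the proportion of product within specification.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    measurements = rng.gamma(shape=4.0, scale=1.0, size=10_000)   # a skewed, bounded process (illustrative)
    lsl, usl = 1.0, 9.0

    # Empirical estimate: simply count conforming parts.
    p_empirical = np.mean((measurements >= lsl) & (measurements <= usl))

    # Normal-theory estimate from the same mean and standard deviation.
    m, s = measurements.mean(), measurements.std(ddof=1)
    p_normal = norm.cdf(usl, m, s) - norm.cdf(lsl, m, s)

    print(f"Empirical conformance:     {p_empirical:.4f}")
    print(f"Normal-theory conformance: {p_normal:.4f}")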