Why Six Sigma used the 1.5 sigma shift.

Many of us look at the 1.5 sigma shift used in calculating the sigma level and wonder why it is there. I recall the explanation given to me in my initial Six Sigma training up at Motorola, many years ago.

When the Six Sigma program started, very little long-term data existed. Most improvement efforts had to create a data collection system as part of the project, so only short-term data was available, covering two to four months at best. An estimate of long-term performance, say over a year or so, was still needed, but there was no established way to predict how much more variability would be seen over the long term as a function of the short-term variability.

Motorola shared with us two empirical studies, performed in Europe as I recall, that compared short-term standard deviations to long-term standard deviations. In these studies the long-term distribution was modeled as a process with the short-term standard deviation, but with a mean that was allowed to shift over the long term. Their model showed that you could estimate the long-term variability of a process by allowing the mean to shift +/- 1.5 short-term standard deviations.

This led to the Six Sigma 1.5 sigma shift. From these studies, Motorola adopted the assumption that the worst-case long-term performance of a process could be estimated by shifting the short-term process mean by +/- 1.5 short-term standard deviations and then estimating the fraction non-conforming to the specification. If the business could accept that level of non-conforming product, then the process was considered capable and you went on to the next improvement.
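
To make that concrete, here is a minimal sketch of the calculation, assuming a normal distribution; the function name, example mean, sigma, and specification limits are made up for illustration, not taken from Motorola's material.

```python
from scipy.stats import norm

def worst_case_fraction_nonconforming(mean_st, sigma_st, lsl, usl, shift=1.5):
    """Estimate the worst-case long-term fraction non-conforming by shifting
    the short-term mean +/- `shift` short-term standard deviations and
    keeping the worse of the two shifted cases."""
    worst = 0.0
    for shifted_mean in (mean_st - shift * sigma_st, mean_st + shift * sigma_st):
        below = norm.cdf(lsl, loc=shifted_mean, scale=sigma_st)  # tail below the lower spec
        above = norm.sf(usl, loc=shifted_mean, scale=sigma_st)   # tail above the upper spec
        worst = max(worst, below + above)
    return worst

# Hypothetical example: mean 100, short-term sigma 2, specs at 92 and 108
# (a "four sigma" process); the worst case comes out near 6,210 ppm.
print(worst_case_fraction_nonconforming(100.0, 2.0, 92.0, 108.0))
```

Setting shift=0 gives the same calculation for data whose standard deviation already reflects long-term behavior, which is the situation described a little further below.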

The idea is that if the process specification is six short-term standard deviations away from the process mean, then it is a six sigma process. The worst-case performance of this six sigma process, with respect to the fraction non-conforming, would then be estimated after shifting the mean +/- 1.5 sigma, which results in the equivalent of a 4.5 sigma yield from a normal distribution using the mean and sigma from the short-term process data.
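
As a quick check of that 4.5 sigma figure against the standard normal distribution (my own sketch, not part of the original training material):

```python
from scipy.stats import norm

# A six sigma process shifted by 1.5 sigma leaves the nearest specification
# limit 6.0 - 1.5 = 4.5 short-term sigmas from the shifted mean.
dpmo = norm.sf(4.5) * 1_000_000  # one-sided tail area, in parts per million
print(round(dpmo, 1))            # ~3.4, the familiar 3.4 defects per million
```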

Now what if you had a long-term standard deviation? In that case you did not include the shift; you reported the fraction non-conforming estimate without adding the assumed shift in the process mean. I hope that makes sense.

Now, to make the conversions, you will find sigma level tables in books, such as table L in Forrest Breyfogle’s Integrated Enterprise Excellence V3. You need to read the table carefully to see which column to use. You enter the table with the yield of good product during the period you collected data. If it was short-term data, you use the column showing the lower sigma level (because you need to assume the process will drift over time). If you have a long-term standard deviation, you assume the 1.5 sigma shift is already included in the long-term data, which allows you to use the column with the higher sigma level (because the shift is already in the data).
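
If you prefer to compute the conversion directly rather than read it from a table, a sketch of the usual convention looks like the following; this is my own illustration of how such tables are typically built, not a reproduction of Breyfogle's table L.

```python
from scipy.stats import norm

def sigma_level(yield_fraction, long_term_data=True):
    """Convert a yield (fraction of good product) into a sigma level.
    Long-term data gets the 1.5 sigma credit because the shift is assumed
    to already be in the data; short-term data does not."""
    z = norm.ppf(yield_fraction)  # standard normal quantile for the observed yield
    return z + 1.5 if long_term_data else z

print(sigma_level(0.9999966, long_term_data=True))   # ~6.0, the higher column
print(sigma_level(0.9999966, long_term_data=False))  # ~4.5, the lower column
```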

This is confusing. It is why the use of the sigma level has fallen out of favor, and why statisticians never liked the sigma level concept in the first place.