# Process Variation Reduction

## Information | Understanding | Best Practice.

**Introduction to Process Variation Reduction.**

**What is variation?** Variation is the amount of difference within a set of measurements or results. Any process will show variation across recorded data; even the most stable and consistent process will demonstrate variation if measurements are recorded in sufficient detail. Data variation normally forms a distribution around a mean (average). For example, if you take a sample set of readings, say the heights of human beings, the majority of readings will be close to the mean, with fewer and fewer readings the further you move from the mean.
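As a minimal sketch of this idea, the snippet below takes a small, purely illustrative sample of heights (the numbers are assumed, not from the text) and computes the mean and standard deviation, then counts how many readings sit within one standard deviation of the mean:

```python
import statistics

# Illustrative sample of adult heights in cm (assumed data for demonstration)
heights = [168, 172, 175, 170, 169, 174, 171, 173, 170, 172]

mean = statistics.mean(heights)    # centre of the distribution
stdev = statistics.stdev(heights)  # sample standard deviation (spread)

# Readings clustered close to the mean, as the text describes
within_1_sd = [h for h in heights if abs(h - mean) <= stdev]
print(mean, round(stdev, 2), len(within_1_sd))
```

With this sample, most readings fall close to the mean, matching the bell-shaped pattern described above.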

The “standard deviation” of a process measures how far individual outputs typically deviate from the process mean. In process variation reduction, the objective tends to be a process with minimal output variation, i.e. one where the specification limits sit 5, 6, 7, … standard deviations (sigma) from the mean, for products and services alike. The widely quoted six sigma process will, in metric terms, produce 3.4 defects per million (DPM) opportunities, allowing for the conventional 1.5 sigma long-term shift in the process mean. Alternatively, if such an output is measured in terms of process capability, it equates to a capability index of 2 (i.e. Cpk = 2).
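The 3.4 DPM figure can be reproduced from the standard normal distribution. The sketch below (the helper names `phi` and `dpm` are my own, not from the text) applies the conventional 1.5-sigma long-term shift that underlies the widely quoted figure:

```python
import math

def phi(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dpm(sigma_level: float, shift: float = 1.5) -> float:
    """One-sided defects per million at a given sigma level,
    assuming the conventional 1.5-sigma long-term mean shift."""
    return 1e6 * (1.0 - phi(sigma_level - shift))

# A six sigma process corresponds to roughly 3.4 DPM under this convention
print(round(dpm(6.0), 1))
```

Without the 1.5-sigma shift, a six sigma process would produce only about 0.002 DPM; the shift is what makes the commonly cited 3.4 figure come out.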

When seeking to reduce process variation, the basic methodology involves measuring process performance by measuring variation, determining the cause(s) of that variation, and then continually implementing process variation reduction.

As the standard deviation of a process's outputs progressively reduces, the number of defects produced by the process can be accurately estimated and can be driven down very close to zero. However, as variation in a process decreases, each incremental improvement requires more effort and determination, such as continual management focus, design of experiments, capability analysis studies, ongoing empowered team structures, etc.

A key element in any analysis related to process variation is the assumption that the data is normally distributed.

**Normal Distribution.** The normal distribution is also known as the Gaussian distribution, after Carl Friedrich Gauss, who derived the normal curve. Its features include a maximum at the mean and data distributed symmetrically around the mean.

The most common distribution in quality control is the “bell-shaped curve”. In this type of curve:

- 68% of the observations fall within +/- 1 standard deviation of the mean
- 95.5% of the observations fall within +/- 2 standard deviations of the mean
- 99.7% of the observations fall within +/- 3 standard deviations of the mean
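These percentages follow directly from the normal cumulative distribution function. The short sketch below (the helper name `coverage` is my own) verifies them using only the standard library:

```python
import math

def coverage(k: float) -> float:
    """Fraction of a normal distribution within +/- k standard deviations."""
    return math.erf(k / math.sqrt(2.0))

# Prints roughly 68.3%, 95.4%, 99.7% for k = 1, 2, 3
for k in (1, 2, 3):
    print(k, round(100 * coverage(k), 1))
```

The exact two-sigma figure is about 95.45%, which is why it is sometimes quoted as 95.5% and sometimes as 95%.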

**How do we relate process variation to process specifications?**

By taking samples, we can measure the output of the process. If we know the process is normally distributed, we can estimate its actual mean and standard deviation from the sample average and sample variation. From the specifications, which should be available for every process output, we can then identify the specification limits: the Lower Specification Limit (LSL) and the Upper Specification Limit (USL).

Once we know the above, we can express the actual output of the process against the required specifications in terms of standard deviations.
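A minimal sketch of this comparison, using the standard capability indices Cp and Cpk (the readings and spec limits below are assumed for illustration):

```python
import statistics

# Hypothetical shaft diameters in mm with assumed specification limits
readings = [10.01, 9.99, 10.02, 10.00, 9.98, 10.01, 10.00, 9.99, 10.02, 10.00]
LSL, USL = 9.90, 10.10

mean = statistics.mean(readings)
sd = statistics.stdev(readings)

# Cp: potential capability (spec width vs. process spread, ignoring centring)
cp = (USL - LSL) / (6 * sd)
# Cpk: actual capability (penalises a mean that is off-centre)
cpk = min(USL - mean, mean - LSL) / (3 * sd)
print(round(cp, 2), round(cpk, 2))
```

Cpk is always less than or equal to Cp; the two are equal only when the process mean sits exactly midway between the specification limits.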

There is a range of statistical approaches to process variation reduction, such as control charts, scatter diagrams, capability analysis, run charts, deviation analysis, etc. The optimum approach depends very much on the process itself and the preferences of those seeking to measure and reduce process variation.
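As one example of the control-chart approach, the sketch below computes individuals (X) chart limits from a moving range. The data is assumed for illustration; the constant 1.128 is the standard d2 factor for a moving range of two observations:

```python
import statistics

# Assumed process readings for an individuals (X) control chart
data = [5.1, 5.0, 5.2, 4.9, 5.1, 5.0, 5.3, 5.0, 4.8, 5.1]

# Moving range between consecutive readings
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = statistics.mean(moving_ranges)
centre = statistics.mean(data)

sigma_est = mr_bar / 1.128      # within-process sigma estimate (d2 for n=2)
ucl = centre + 3 * sigma_est    # upper control limit
lcl = centre - 3 * sigma_est    # lower control limit

# Points outside the limits would signal special-cause variation
out_of_control = [x for x in data if not (lcl <= x <= ucl)]
print(round(lcl, 3), round(ucl, 3), out_of_control)
```

Points falling outside the control limits suggest special-cause variation worth investigating, which is the starting point for determining the cause(s) of variation described earlier.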

## SPC & Statistical Methods for Process Improvement.

- Process Capability. Variability Reduction. Statistical Process Control.
- Pre-Control. R&R Studies.
- Process capability indices Cp, Cpk, Cpm, Capability ratio.
- Performance indices Pp and Ppk.
- Variable Control Charts.
- Attribute Charts.
- Pareto Charts.
- Individual-X Charts.
- Histograms / Process Capability Analysis.
- Scatter Diagrams.
- Etc. … Etc. …