I dislike the term “labor standard”. I know, it’s a term that’s embraced by the distribution industry, but it basically implies a minimum level of performance. I prefer the term “performance target”. It means we are reaching for a goal and tends to set a more positive perspective on the whole program.
Many companies have taken their engineered labor standards to such extremes of analysis that they have lost sight of the big picture. When you drown yourself in engineering data – overwhelming amounts of it – it's easy to lose the forest for the trees. There is a simpler way to improve labor efficiency: measure and act on the most important metrics and ignore the rest.
As an economics major in college, I was taught to take a statistical modeling approach to analysis rather than trying to build a formula that predicts the economy. This modeling approach also works very well on any large data set. In particular, it works exceptionally well for studying employee productivity and developing performance targets.
Two Approaches to Engineered Labor Standards
There are two basic approaches you can take in building your engineered labor standards (performance targets): industrial engineering or benchmarking. Both can work, but each comes with its own strengths and weaknesses. I've worked with both methods within the same operation, and comparing the two in the same facility makes it easy to see what works and what doesn't.
Industrial Engineering – Beware the Noise Metrics
Engineers like to measure everything that can possibly be measured. Frequently I see process standards based on 15 or more different metrics. My problem with this is that every process has an inherent productivity variance, caused by things such as differences in employee height or walking speed, the speed of an RF scan gun on the WiFi, or millions of other factors. These variances create what I like to call "signal noise," and it can account for a natural variance of up to 10% on any given standard.

The 15+ metrics all have some impact on the standard, but many of them fall well within the signal noise that naturally occurs in the process, rendering them statistically insignificant. I call these "noise metrics." Statistical modeling tells you which metrics have a strong correlation to the engineered labor standard and which don't. Don't waste your time or money trying to capture the noise metrics, because they do not meaningfully affect the standard. Industrial engineers tend to disagree with this because they believe more information is always better, but in practical terms, the complexity and cost of maintaining such detailed measurements exceed any value you will get out of them.
These numbers came from an actual engineer who performed time-and-motion studies and a multitude of calculations (many more than what I included on this chart). In this case, measuring the deceleration and acceleration of the forklift is nonsensical: its correlation to labor is so small it becomes irrelevant.
In this example, the number of cases is the metric that impacts labor the most, because hand-stacking the cases onto the pallet requires human labor; the employees then move the pallets to their next destination.
The key metrics are "number of cases" and, secondarily, "number of pallets." The rest can be ignored. Focus on what is important; don't get lost in the details. Examine and act on the issues that have a big labor impact, and ignore the signal noise.
Benchmarking – A Top Down Approach
Benchmarking starts by determining what measurements are available for the process in question, and then how "clean" the data set is. This is where benchmarking can break down at the outset. Industrial engineers take a bottom-up approach, using industry process standards and time-and-motion studies to build a standard for each process; benchmarking instead analyzes the data you already have and attempts to model it. If that data is inconsistent and of poor quality, you will get targets that are also inconsistent and of poor quality. To be fair, bad data hurts the engineering model too: the standards may come from an engineer, but your performance tracking still comes from the data, so the comparison against standard suffers either way.
However, assuming the data set is clean and relatively accurate, benchmarking will tell you the correlation of each metric to actual performance, and it can be done quickly and for a fraction of the cost of having a team of industrial engineers analyze your operation. I have also repeatedly seen benchmarking models prove more consistent than engineered standards, because benchmarking takes into account everything that actually happens in the process. Employees who work the process every day have figured out ways to do it better and faster than what an engineer may develop from industry baselines. Every operation is unique, and that uniqueness is very difficult to capture through traditional labor standard development. Statistical modeling, by contrast, tells you what is actually being done and how it correlates to each metric, process, and employee.
Combining Benchmarking and Engineered Labor Standards
The best model incorporates a bit of both. Our usual approach is to start by benchmarking all processes, often hundreds in an operation. The results will show you which processes are problematic, i.e., which have a high degree of variance in their results. Once you have built that baseline, it is helpful to have engineers or your process analysts come in and fine-tune the processes that show inconsistencies.
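That screening step, flagging high-variance processes for the engineers, can be sketched directly from the baseline data. The process names, performance ratios, and the 10% cutoff below are all invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical baseline: performance ratios (actual hours / target hours)
# per process, from a benchmarking pass. Names and numbers are invented.
process_ratios = {
    "case_pick":    [0.98, 1.02, 1.01, 0.97, 1.03],
    "pallet_build": [0.95, 1.04, 1.00, 1.01, 0.99],
    "replenish":    [0.70, 1.35, 0.88, 1.40, 0.65],  # wildly inconsistent
}

CV_LIMIT = 0.10  # flag processes whose coefficient of variation exceeds 10%

flagged = []
for process, ratios in process_ratios.items():
    cv = stdev(ratios) / mean(ratios)  # coefficient of variation
    if cv > CV_LIMIT:
        flagged.append(process)
    print(f"{process:12s} CV = {cv:.1%}")

print("send to engineering review:", flagged)
```

Consistent processes stay on their benchmark targets; only the volatile ones earn a visit from the engineers, which is where their detailed analysis pays for itself.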