Big Data Reveals Faster, Cheaper Labor Standards

Posted on: July 9, 2014

The term “Big Data” refers to much more than large data sets. It describes an era that is changing how we work and live. The big data age pushes us to ask not just how much data we have, but how we are using it. Recent advances in data technology and hardware have enabled data scientists to solve problems that were previously out of reach due to technology limitations.

Historically, science was limited by the amount of data it could analyze. To compensate, scientists developed statistical methods that let them work from smaller samples and build models from them. With this approach, the data had to be very clean, because any errant data point could throw off the accuracy of the model. The scientific method focused on causality: scientists would form a hypothesis about WHY something happens and then attempt to prove it with data modeling. The challenge is the selection bias of human nature, i.e., the temptation to shape the data to fit the theory.

Big data takes a very different approach. It does not care about the WHY, only the WHAT. With big data, we can quickly analyze vast amounts of data and find correlations that tell us WHAT happens. To illustrate: big data solved a problem with manhole covers in New York City, as detailed in Viktor Mayer-Schönberger’s recent book, Big Data. NYC has 51,000 manhole covers, and every year a number of them launch as much as 200 meters into the air due to gas explosions. Engineers identified 106 input factors that could cause an explosion. Solving the problem with the engineering approach would have taken several years and been cost-prohibitive. Instead, analysts took the big data approach, correlating the 106 input factors across all covers that had exploded over the previous ten years. Two of the 106 factors showed a strong correlation with exploding covers. The city then identified all the covers where those two factors were present and repaired them first. The following year there was a dramatic drop in cover explosions.
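
As a rough illustration, here is a minimal sketch of that kind of factor screening in Python. It assumes a table with one row per manhole, numeric factor columns, and a 0/1 “exploded” outcome; all file and column names are hypothetical, not from the actual NYC study.

```python
import pandas as pd

# Hypothetical data: one row per manhole, 106 candidate factor columns
# (f_001 .. f_106) plus a 0/1 "exploded" outcome. Names are illustrative.
df = pd.read_csv("manholes.csv")
factor_cols = [c for c in df.columns if c.startswith("f_")]

# Rank every input factor by the strength of its correlation with the
# outcome; the big data approach keeps the top correlates, WHY unknown.
ranking = df[factor_cols].corrwith(df["exploded"]).abs().sort_values(ascending=False)
print(ranking.head(5))  # the handful of factors worth acting on first
```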

Big Data Means Big Change in Labor Standards

So how can Big Data be applied to labor management? The manhole example maps well onto warehouse operations. The classic approach is to engage industrial engineers to measure every factor of a specific process and develop a labor standard. This widely accepted approach works very well in static environments, but it is also time-intensive and expensive because of the high-cost engineering labor involved. The big data approach to labor management takes the data already available and runs correlations between input factors and results, generating accurate standards from as little as one month of collected data. That may not sound like much, but an operation with 100 employees generates upwards of 500,000 process transactions in a month – a significant data set for analysis.
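
As a simple sketch of that correlation modeling, consider a transaction log with an observed time column and a few candidate driver columns. Everything here is an assumption for illustration; the file, column names, and correlation cutoff are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical transaction log: one row per completed task, with candidate
# input factors ("drivers") and the observed task time. Names illustrative.
log = pd.read_csv("transactions.csv")
drivers = ["picks", "distance_ft", "cases", "weight_lb"]

# Keep only the drivers that meaningfully correlate with task time.
keep = [d for d in drivers if abs(log[d].corr(log["minutes"])) > 0.3]

# Fit minutes = base + sum(coef_i * driver_i); the fitted coefficients
# become the multi-variable labor standard for this process.
A = np.column_stack([np.ones(len(log))] + [log[d].to_numpy(float) for d in keep])
coef, *_ = np.linalg.lstsq(A, log["minutes"].to_numpy(float), rcond=None)
print(dict(zip(["base"] + keep, np.round(coef, 3))))
```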

Until a few years ago, spreadsheets were capped at 65,536 rows. Excel 2013 now allows just over a million rows, but a spreadsheet that size will often overwhelm your computer. Because of these limitations, developing labor standards once required an engineering approach. Now, big data offers another way to reach a similar result.

Big Data Slashes Costs 90%

A client of ours generates over 3 million transactions a day, which would be nearly impossible to analyze via spreadsheet. Using the classical approach of enterprise software and engineered standards, they spent 12 months and well over $1 million implementing an enterprise Labor Management System (LMS) that ultimately failed. Using the big data approach, we were able to replace their LMS with a cloud-based system, correlate their processes, and get them up and running in a few months. In general, the classic approach costs between $150,000 and $250,000 and takes several months to develop engineered standards for an average operation, plus additional costs each time a process changes. With big data correlation modeling, standards can be tabulated in a few weeks for roughly 10% of the cost of the classic approach. That’s a 90% cost savings.

Engineering hours aren’t the only factor driving the expense of the classic approach. Frequently, the data needed to cover all the input factors of a process simply isn’t being collected. Distance traveled, box weight, box size: the engineer may need all of these and not have them, because the WMS doesn’t track them. Capturing them requires modifications or upgrades to the WMS, which is very expensive and can run into the millions of dollars.

Big Data, Clearer Picture

Big data contains information that an engineering approach will often miss. Every operation has unique characteristics. This uniqueness shows up in the data models but is hard to uncover observationally unless you work the process yourself over a period of time. The engineering model calls for closely watching one person on one process for a few hours. Big data watches 100 people over 30 days on that process.

Big Data Finds Superstar Grandma

How does a slow-moving, 60-year-old grandmother outperform all of her co-workers? When you see the whole picture big data supplies, rather than relying on time-and-motion studies, it’s easy to spot someone performing off the charts during the calibration process – before the labor standards even exist. Take Fran, for example. She worked the pick line operation at a major clothing brand. Fran was not aggressive-looking or fast-moving; observationally, she walked more slowly than the other workers did. The engineering approach would calculate the distance walked between picks, add the time spent on each pick, and sum them into a standard. Big data revealed her actual performance was 80% above that engineered standard. How was she outperforming her peers? It turns out she slowed her walk but never stopped when making a pick; she kept moving continuously, looking ahead to the next pick. Big data analysis lets you see these exceptions in the data up front and incorporate them into your standards. The data on one employee with an exceptional productivity level changed the process.
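
A minimal sketch of that calibration step, assuming a log of pick transactions with employee IDs and per-pick times (names hypothetical): compute each worker’s average pace and flag statistical outliers, surfacing a Fran before any standard is set.

```python
import pandas as pd

# Hypothetical pick-transaction log: one row per pick, with the employee ID
# and the seconds that pick took. Column names are illustrative.
picks = pd.read_csv("pick_transactions.csv")  # columns: employee, seconds

# Average seconds per pick for every employee working the process.
per_emp = picks.groupby("employee")["seconds"].mean()

# Flag anyone far outside the group distribution (z-score beyond 3).
z = (per_emp - per_emp.mean()) / per_emp.std()
print(per_emp[z.abs() > 3])  # the Frans surface here, before any standard exists
```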

Modeling the Superstar

Often, as in Fran’s case, employees know how to perform processes better than the engineers do. An engineering model would have identified Fran’s stellar performance only after the fact, once labor standards were already in place. Once she was discovered, there would be efforts to train the other workers to work like her, and engineers would recalibrate the labor standard to emulate the superstar. Every recalibration is expensive. Using big data to identify the top performers up front eliminates those costs.

Humans + Big Data = Ideal

Industrial engineering is still very useful to companies in fixing inefficient processes. Often the best results come from combining big data analysis with the industry expertise of the engineers. Big data lets engineers test their models against real data. It gives them better tools to do their jobs and concrete validation for what they are doing.

Engineers, operations managers, and operators are also essential in aligning big data standards with reality, because big data isn’t perfect. Dirty data and noisy metrics can throw off the correlations and produce unrealistic standards. Nobody can walk 42 miles per hour, and if the data suggests otherwise, something went wrong. That’s where humans bring the standards back to reality. Always verify your big data labor standards with real humans.
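
As a simple illustration of that human sanity check, here is a sketch that flags any derived standard implying an impossible walking pace. The threshold and field names are assumptions for the example.

```python
# Assumed threshold: cap implied pace at a brisk human walking speed.
MAX_WALK_MPH = 4.0

def implied_walk_mph(distance_ft: float, minutes: float) -> float:
    """Walking speed implied by a standard's distance and time allowance."""
    return (distance_ft / 5280.0) / (minutes / 60.0)

def flag_unrealistic(standards: list[dict]) -> list[dict]:
    """Return any standards whose implied pace exceeds human limits."""
    return [s for s in standards
            if implied_walk_mph(s["distance_ft"], s["minutes"]) > MAX_WALK_MPH]

# Example: a noisy correlation produced a 42 mph "standard"; it gets flagged.
print(flag_unrealistic([{"name": "pick_line", "distance_ft": 3696, "minutes": 1.0}]))
```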

Whether humans or computers create your labor standards, big data simplifies and clarifies the work. No single approach is perfect, but the two together achieve the best results for productivity on the floor.
