The business world has evolved into one of data-driven decision making. From our own experience, we have seen many times where the data paints a completely different story than managerial observation alone.
As part of any Continuous Improvement effort, measuring the processes you are trying to improve is critical, and that requires data. What is exciting is that with the Internet of Things (IoT), many facets of a business are being datafied, making it easier and easier to measure baseline and progress.
A good portion of companies that are mature enough in their processes to have continuous improvement initiatives will have software systems that track workflow around the processes they are looking to improve. This data can be invaluable by providing the process measurements you need.
The first step is to take an inventory of the data systems you have and what information is available in them. For the purposes of this example, we will focus on a distribution center:
Data System Inventory
System | OEM | Purpose
Time Clock | Kronos | Permanent employee time
Time Clock | Primetime | Temporary staffing labor time
Warehouse Control System | Bastian | Conveyor WCS database
Lift Truck Telematics | iWarehouse | Vehicle operating database
Warehouse Management System | Manhattan SCALE | Tracks inventory moves
Enterprise System | SAP | Accounting, CRM, and finance
Transportation Management | MercuryGate | Tracks delivery
Employee Wearables | Modjoul | Tracks employee movement
The next step is to determine which systems may have data for the process you are trying to improve. We will assume you have already done a comprehensive process map (to be written about in a subsequent blog) and have identified the key areas you want to focus on.
For this example, we want to reduce our overall variable handling charge per case, as measured by total operational labor cost divided by total cases shipped. This is a great macro measurement of overall success on improvement, but it is too general a measurement to improve processes and drive change.
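As a quick illustration of the macro metric, the sketch below computes the handling charge per case from made-up weekly figures; the dollar amounts and case counts are assumptions for illustration, and real values would come from your time clock and WMS systems.

```python
# Hypothetical weekly figures -- illustrative only.
total_labor_cost = 48_500.00    # total operational labor cost ($)
total_cases_shipped = 62_000    # total cases shipped in the same week

# Variable handling charge per case = labor cost / cases shipped
handling_charge_per_case = total_labor_cost / total_cases_shipped
print(f"Handling charge per case: ${handling_charge_per_case:.4f}")
```

Tracking this single number over time tells you whether the operation as a whole is improving, but not where the improvement (or regression) came from.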
Each process that rolls into the handling charge should be identified, along with the data sources that map to each process.
Process | UOM | Data Source
Receiving | Pallets + hrs | None / manual tracking
Putaway | Pallets + hrs | WMS
Picking | Cases/Eaches + hrs | WMS/WCS
Shipping | Cases + hrs | WMS
Cycle Counts | Cases + hrs | WMS
Indirect Labor | Hours + % | Time clock
The process breakdown above shows a gap in data collection for Receiving. In general, receiving is not tracked by the WMS, so the labor involved in that process needs to be tracked separately; this can be done manually or with a job coding system like Job Trak. The other processes, except indirect labor, are tracked by the WMS and WCS. Most WMS platforms will show you total labor hours and UOM by process, which is very helpful from a baseline measurement perspective.

If you access the raw transaction table on the WMS, you can get much greater detail on each process. For proper process analysis, you really need this detail. A case is not always a case (cube, weight, and location all vary), and having more process metric detail is critical to differentiate between the variances that occur in the workflow.
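As a sketch of what you can do with the raw transaction detail, the snippet below aggregates hypothetical WMS transaction rows into cases, hours, and cost per case by process. The field names, rows, and blended labor rate are illustrative assumptions, not a specific WMS schema.

```python
from collections import defaultdict

# Hypothetical WMS transaction rows -- field names are illustrative.
transactions = [
    {"process": "Picking",  "cases": 12, "hours": 0.40},
    {"process": "Picking",  "cases": 3,  "hours": 0.25},
    {"process": "Putaway",  "cases": 24, "hours": 0.30},
    {"process": "Shipping", "cases": 36, "hours": 0.50},
]

# Roll transactions up to totals per process.
totals = defaultdict(lambda: {"cases": 0, "hours": 0.0})
for t in transactions:
    totals[t["process"]]["cases"] += t["cases"]
    totals[t["process"]]["hours"] += t["hours"]

labor_rate = 18.50  # assumed blended hourly labor rate ($/hr)
for process, agg in totals.items():
    cost_per_case = (agg["hours"] * labor_rate) / agg["cases"]
    print(f"{process}: {agg['cases']} cases, ${cost_per_case:.3f}/case")
```

Against a real transaction table you would also carry cube, weight, and location on each row so the per-process numbers can be segmented further.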
For example, if you are focused on your cost per case as a measurement, that cost can vary greatly based on the number of cases per line (SKU). If you average 10 cases per SKU, your cost per case will be much lower than if you average 2 cases per SKU, because you have to travel to one-fifth as many locations to get the same number of cases.
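To make the cases-per-line effect concrete, here is a minimal sketch. The labor rate and the per-location and per-case times are assumed numbers chosen only to show the shape of the relationship.

```python
labor_rate = 18.50              # assumed blended hourly rate ($/hr)
travel_min_per_location = 1.5   # assumed travel time per pick location (min)
handle_min_per_case = 0.25      # assumed handling time per case (min)

def cost_per_case(cases_per_line: float) -> float:
    """Picking cost per case for a line with the given case count."""
    minutes = travel_min_per_location + handle_min_per_case * cases_per_line
    return (minutes / 60 * labor_rate) / cases_per_line

print(f"10 cases/line: ${cost_per_case(10):.3f} per case")
print(f" 2 cases/line: ${cost_per_case(2):.3f} per case")
```

Because the travel time is spread across more cases, the 10-cases-per-line scenario comes out well under half the per-case cost of the 2-cases-per-line scenario, even though the handling time per case is identical.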
In our view, cost per unit is the ultimate measurement for communicating improvement in productivity because it normalizes all the factors and is easy for everyone to understand. But the more granular measurements of lines, cases, eaches, pallets, travel, locations, etc. are important to have when doing detailed process analysis and identifying ways to improve a process. For example, if your picking process has excessive travel, investing in a slotting system can help reduce travel, reduce time on process, and thus drive down your cost per unit.
The challenge with data is the sheer volume available. Most companies will attempt to use spreadsheets to do this analysis. It can be done, but it is both time- and labor-intensive and tends to be static in its analysis. A better approach is to use a database system integrated with your internal IT systems so the data can be continuously updated and analyzed. This provides a feedback loop: as process changes are implemented, you can quickly see whether they improved things and drove your costs down.
Easy Metrics, through its integration tool Logic Writer, has made it very easy to integrate with multiple data sources and massive volumes of data for analysis. Assuming you have access to the transaction data in your WMS, ERP, CRM, WCS, MES and time clock systems, Easy Metrics can fully integrate that data in as little as a day or two and provide a detailed cost per unit by process and gap analysis of your operations.