Variability can be controlled with different kinds of pooling. For example, if a machine receives its raw material from multiple sources, the coefficient of variation of the combined feed will be close to CV = 1. Whether the individual sources are LV or HV machines, their summed feed tends to be moderately variable.
A good example of risk reduction comes from the world of finance. Investors are advised to diversify their investments across multiple assets, so that one bad investment does not drag down the whole portfolio. Correspondingly, one good investment will not lift the whole portfolio much either. By diversifying we average the return of our portfolio toward the index trend.
How can we utilize this in production?
In the previous post I asked which is more variable: the time spent producing one product, or a batch of products. As in the finance example above, when we produce multiple products their processing times are summed, so the variability of the individual products averages out and the total batch time tends toward the average. The time spent on batch production therefore has less relative variability than the time spent producing single products.
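This averaging effect is easy to check numerically. The sketch below is my own illustration, not from the post: the exponential unit times (CV ≈ 1) and the batch size of 25 are assumptions chosen for the example.

```python
import random
import statistics

# Hypothetical illustration: 10,000 per-unit processing times with CV ≈ 1
# (exponential), grouped into batches of 25 units each.
rng = random.Random(0)
unit_times = [rng.expovariate(1.0) for _ in range(10_000)]
batch_times = [sum(unit_times[i:i + 25]) for i in range(0, 10_000, 25)]

def cv(xs):
    """Coefficient of variation: standard deviation divided by mean."""
    return statistics.stdev(xs) / statistics.mean(xs)

print(cv(unit_times))   # close to 1
print(cv(batch_times))  # close to 1/sqrt(25) = 0.2
```

The batch CV shrinks roughly as 1/√n, which is exactly the pooling effect the finance example describes.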
But this doesn’t mean that products should always be produced in batches. Batch production causes problems of its own, yet the averaging effect can be exploited in some cases. For example, quality control can be performed on a whole batch when the measured values are averaged; of course, the limit values per product must still be met.
Let's take the example of a computer manufacturer that sells computers with six configurable parts (hard drive, RAM, CD, keyboard, processor, memory card), each with three options. The computer can therefore be assembled in 3^6 = 729 different ways. For simplicity, let's assume that every part costs 100 euros, so a whole computer costs 6 * 100 = 600 euros. Demand for each configuration is Poisson distributed with a mean of 100 units per year, and the lead time is 3 months.
With a safety stock calculator we can compute the safety stock for a yearly demand of 100 pcs, a lead time of 0.25 years and a service level of 99 % to be 11.63 pieces. Adding the expected demand of 25 pieces during the lead time, the stock right after replenishment is 37 pieces and the average inventory 24.5 pieces ([37 + 12] / 2). This gives an average inventory value of 24.5 * 600 € = 14,700 euros per configuration, and the whole inventory would be worth 14,700 € * 729 = 10.72 million euros.
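These figures can be reproduced with the standard normal-approximation safety stock formula. This is a sketch: I'm assuming the calculator uses a lead-time demand standard deviation of sqrt(D * LT), which is the Poisson-demand assumption and matches the 11.63 result.

```python
from math import sqrt
from statistics import NormalDist

def safety_stock(annual_demand, lead_time_years, service_level):
    """Normal-approximation safety stock for Poisson demand."""
    k = NormalDist().inv_cdf(service_level)            # safety factor k
    sigma_lt = sqrt(annual_demand * lead_time_years)   # std dev of lead-time demand
    return k * sigma_lt

D, LT = 100, 0.25
ss = safety_stock(D, LT, 0.99)       # 11.63 pieces
lead_time_demand = D * LT            # 25 pieces
# Stock right after replenishment and average inventory,
# with the safety stock rounded to 12 pieces as in the text:
initial = round(ss) + lead_time_demand     # 37 pieces
avg_inventory = (initial + round(ss)) / 2  # 24.5 pieces
```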
If we assemble computers from parts to order instead of stocking pre-assembled computers, we have far fewer items to stock: only 6 * 3 = 18 stock keeping units (SKUs) instead of the previous 729 assembled computers. Each component then needs a service level of only 0.99^(1/6) = 0.998326 in order to reach a 0.99 service level for the assembled computer. This corresponds to a k-factor of 2.9338.
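The component-level target follows from requiring all six parts to be available at once, assuming the stocks are independent (which the text implies):

```python
from statistics import NormalDist

target = 0.99                       # service level for the whole computer
per_part = target ** (1 / 6)        # 0.998326: all 6 parts must be in stock
k = NormalDist().inv_cdf(per_part)  # safety factor k, approximately 2.93
```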
Because 100 units/configuration * 729 configurations = 72,900 computers are sold per year, the demand for each of the 18 components is 4,050 pieces per year. This leads to a safety stock of 47 pieces and an average inventory of 553.25 pieces per component. Our average inventory value would then be 553.25 pcs * 100 € * 18 components = 995,850 euros. A modest 90 % drop in inventory value!
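Putting the before/after numbers together (the per-component safety stock of 47 pieces and average inventory of 553.25 pieces are taken as given from the calculator figures above):

```python
# Before: 729 pre-assembled configurations stocked as finished computers.
configs, computer_cost = 729, 600
avg_inv_per_config = 24.5
value_before = configs * avg_inv_per_config * computer_cost   # 10,716,300 euros

# After: 18 component SKUs, per-SKU figures from the text.
skus, part_cost = 18, 100
avg_inv_per_sku = 553.25
value_after = skus * avg_inv_per_sku * part_cost              # 995,850 euros

drop = 1 - value_after / value_before                         # about 0.91
```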
With this simple change we cut our inventory value and invested capital significantly. We pooled the demand of 729 computer configurations into only 18 components, and at the same time our warehouse management became easier.
The safety stock alone dropped from 12 * 729 units * 600 € = 5.25 million euros to 47 * 18 units * 100 € = 84,600 euros.
Another way to even out variability is to combine queues. If multiple machines are doing the same job, they should pick their jobs from the same queue. Compare a grocery store and a bank. A traditional bank has a single queue for multiple tellers, so the highly variable service times even out for the people in the queue. A grocery store normally has a separate line for each cashier: if someone forgot to weigh their fruit, or something else goes wrong, the whole line has to wait until the error is resolved. Some people will voluntarily switch lines; line hopping makes the system behave more like a combined queue, but with less efficiency and less fairness.
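The bank-versus-grocery difference can be sketched with a small simulation. This is my own illustration, not from the post: two exponential servers are fed either by one shared FIFO queue or by lines chosen at random on arrival (no line hopping), with arrival and service rates chosen to give a high utilization.

```python
import random

def average_wait(pooled, n_customers=50_000, servers=2, lam=1.8, mu=1.0, seed=1):
    """Mean waiting time with Poisson arrivals (total rate lam) and
    exponential service (rate mu per server)."""
    rng = random.Random(seed)
    free_at = [0.0] * servers   # time at which each server next becomes free
    t, total_wait = 0.0, 0.0
    for _ in range(n_customers):
        t += rng.expovariate(lam)
        if pooled:
            # One shared FIFO queue: take the first server to become free.
            i = min(range(servers), key=lambda j: free_at[j])
        else:
            # Separate lines: join a cashier chosen at random, then stay.
            i = rng.randrange(servers)
        total_wait += max(0.0, free_at[i] - t)
        free_at[i] = max(free_at[i], t) + rng.expovariate(mu)
    return total_wait / n_customers

print(average_wait(pooled=True))    # shared queue: shorter average wait
print(average_wait(pooled=False))   # separate lines: longer average wait
```

With these parameters the shared queue gives a clearly shorter average wait, because an idle server can never coexist with a waiting customer, whereas with separate lines one cashier can stand idle while another's line grows.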