Summary presentation is available at the link. November manufacturing data was released by the Census Bureau on January 6, 2017.
Contact TCS for details on industry sector performance.
Updates on the world of high performance organizations and the information they use.
Since 2007 the working age population has grown by 22.9 million. But the number employed has grown by only 6.5 million. The number not employed has grown by 18 million.
Employed % declined from 63% to a low of 58.4% in 2011. Since then it has recovered to 59.7%. Of the 6.5 million jobs created since 2007, over 90% have been part time.
There has been no recovery in employment.
Since January 2007 the working age population has grown by 22.4 million, but we have only generated 5.9 million jobs.
The employed % of the population was at 63% in 2007 and now stands at 59.5%. It is true that this is an improvement from the low point of 58.4% in 2011, but not by much. This is the weakest recovery since WWII.
The employed % of the working age population is shown below. This is the primary argument for the stagnant economy. There has never been a real recovery in the employment picture....
Have you ever thought about how the BOM processor works in your ERP system? Not many do. The first one was written in 1963, and they still work the same way today. Put in the count of end items and the BOM processor explodes it into the quantities of intermediate and component parts required.
If you manage a supply chain you've probably wondered if you could do it for the entire supply network. It has rarely been tried, due to the limitations of existing data management technology.
What if you had a different technology platform? What if you could thread the BOM structure directly, so that every branch was directly connected to the end item? And what if you could traverse each level in the branch in 32 nanoseconds? Would that change how you built the BOM? Now you can with Ancelus.
Now let's really get outside the box. What if you wanted to run the BOM backwards?
With Ancelus the BOM processor could run the explosion or the implosion with equal speed....
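The explosion/implosion idea can be sketched in a few lines. This is a toy model, not Ancelus or any real BOM processor: the part names, quantities, and dictionary representation are all hypothetical, and a conventional relational implementation would look very different.

```python
from collections import defaultdict

# Hypothetical single-level BOM: parent -> [(component, qty per parent)].
# Part names and quantities are illustrative only.
BOM = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
}

def explode(part, count, totals=None):
    """Explode an end-item count into total component requirements."""
    if totals is None:
        totals = defaultdict(int)
    for child, qty in BOM.get(part, []):
        totals[child] += count * qty
        explode(child, count * qty, totals)   # recurse down each branch
    return totals

def implode(part):
    """Run the BOM backwards: which parents use this part, and how many
    are needed per unit of each parent, all the way to the end item."""
    where_used = defaultdict(int)
    for parent, children in BOM.items():
        for child, qty in children:
            if child == part:
                where_used[parent] += qty
                for top, per in implode(parent).items():
                    where_used[top] += per * qty
    return where_used

print(dict(explode("bike", 10)))   # 10 bikes -> 640 spokes, 20 rims, ...
print(dict(implode("spoke")))      # 32 per wheel, 64 per bike
```

The point of the post is not the logic, which is simple, but the traversal cost: in a conventional database each level of recursion is a join or query, while a directly threaded structure makes each hop a pointer dereference.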
It is common for companies with multiple products to compare performance using gross margin. Allocation of indirect (overhead) expenses at the product level is rarely possible with any accuracy. So the next best thing is to measure revenue less direct expense (gross profit) as a % of revenue (gross margin).
The goal in this post is to show how that leads to operating decisions that are opposite of the goals.
To study the behavior of this decision cycle we model a company with two products. Indirect expenses and deployed capacity are assumed to be constant in total, with only the allocation between the products to be decided.
The following simplified income statement shows our starting position....
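The decision trap can be sketched with hypothetical numbers (these are illustrative, not the figures from the income statement). Ranked by gross margin, product A wins; ranked by gross profit per hour of the constrained capacity both products share, product B wins:

```python
# Hypothetical two-product model (all numbers illustrative).
products = {
    #        revenue   direct cost  capacity hours consumed
    "A": dict(revenue=100_000, direct=40_000, hours=2_000),
    "B": dict(revenue=100_000, direct=55_000, hours=1_000),
}

for name, p in products.items():
    gp = p["revenue"] - p["direct"]        # gross profit
    margin = gp / p["revenue"]             # gross margin %
    per_hour = gp / p["hours"]             # gross profit per capacity hour
    print(f"{name}: margin {margin:.0%}, gross profit/hour ${per_hour:.0f}")
```

Here A shows a 60% margin against B's 45%, so a margin-driven manager favors A. But B generates $45 of gross profit per capacity hour against A's $30, so shifting the fixed capacity toward A reduces total profit, exactly the opposite of the goal.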
This is going to be a long post. First in a three part series. Hang in there. The insights will be useful.
One of my disappointments with our industry is the pervasive use of backward-looking accounting methods for forward-looking decision support. Accounting measures are intended as historical tools. But the independent variables - those things you control to produce the accounting results - generally aren't found in the accounting model.
An example is the Fixed-Variable Model used to define "break even volume." It is universally accepted because it fits the structure of the income statement. It is taught in every accounting course and MBA program. It is profoundly wrong. It has some limited use in post-hoc analysis, but as a decision support tool it is dangerously misleading.
A few simple models will help expose the flaws....
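For reference, here is the textbook fixed-variable break-even calculation, with hypothetical numbers. The formula itself is standard; the hidden assumptions it bakes in are that every cost splits cleanly into fixed and variable and that the variable cost is linear in volume:

```python
# Textbook break-even model: volume where revenue covers all costs.
# break_even_volume = fixed_cost / (price - variable_cost_per_unit)
def break_even_volume(fixed_cost, price, variable_cost):
    contribution = price - variable_cost   # contribution margin per unit
    return fixed_cost / contribution

# Hypothetical case: $50 price, $30 variable cost, $100k fixed cost.
print(break_even_volume(100_000, 50, 30))   # 5000.0 units
```

The arithmetic is trivially correct after the fact; the danger is using it to decide forward, where the "fixed" and "variable" classifications are not the independent variables you actually control.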
Most of the Six Sigma community considers Six Sigma to be an analysis technique. That's only partly true. We rarely hear speed mentioned in these discussions, yet it is the most fundamental element. Here's why.
1. The objective of the analysis is to find the cases where a process ceased to behave randomly, determine the cause and correct it.
2. A process deviates because something has happened that its designers had not considered. If they had considered it they would have designed in a way to control it.
3. If the cause was not considered during design, it's unlikely you are collecting data on it. So exactly what are you going to analyze? This means you are going to collect more data, but on what?
So a critical element of any Six Sigma program must be speed of detection. It is imperative to catch the problem in the act, before the gun smoke drifts away. Without this ability, you are unlikely to find root cause in a timely fashion....
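The "catch it in the act" idea implies a detector that runs on every data event as it arrives, not on a batch afterwards. A minimal sketch, with hypothetical baseline data, flagging any new measurement outside the 3-sigma limits of an established stable baseline:

```python
import statistics

# Hypothetical baseline from a period when the process behaved randomly.
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # 3-sigma control limits

def check(x):
    """Called on every data event, the moment it occurs."""
    return lcl <= x <= ucl   # False => out of control, investigate NOW

print(check(10.05), check(12.0))
```

The detection rule is one comparison; the hard part the post is pointing at is organizational: having the limits computed, the data flowing, and someone reacting while the smoke is still in the air.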
There's an interesting thread running on the Lean/Six Sigma group at LinkedIn, discussing "what is three sigma." It has wandered all over the map since January, and at times must be really confusing for newcomers. The jargon is daunting.
A problem with this movement is that it has attempted to subsume everything in manufacturing into a single "process" for improvement.
Six Sigma was originally much simpler. Here's an attempt to reduce it to its simplest essence:
1. Everything measured has a natural variability. Natural variability is distributed randomly. This defines 3 sigma boundaries for any measured value. Forget the math for the moment.
2. A process running at Six Sigma has a natural variability of 3 sigma. (Stick with me here)....
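To put rough numbers on those boundaries, here is what a standard normal distribution implies (a sketch, assuming normality; the conventional "3.4 defects per million" figure for Six Sigma additionally assumes a 1.5 sigma shift of the mean, which this ignores):

```python
import math

def within(k):
    """Fraction of a standard normal distribution within +/- k sigma."""
    return math.erf(k / math.sqrt(2))

print(f"within 3 sigma: {within(3):.4%}")    # ~99.73%
print(f"within 6 sigma: {within(6):.10%}")   # ~99.9999998%
```

So a process whose natural 3-sigma spread sits inside spec limits that are 6 sigma wide has enormous headroom; that headroom, not the math, is the essence.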
The concept of kanban has held a special attraction for factory managers over the years. The reason is its simplicity, and the apparent absolute accuracy of responding to the consumption signals of a downstream operation. Of special attraction is the lack of any computers needed in the system.
But there are some serious limitations to watch out for. They are routinely missed and can make a real mess.
The easiest explanation of kanban is the "two bin system" used for replenishment signals in low value parts. Start with two bins of parts. At the bottom of each bin is a card used to reorder the parts. One bin is active and the second is reserve. When the active bin is emptied, the card is removed from the bin and sent to purchasing. The reserve bin is converted to active.
Simple enough. And it delivers one of the most important steps in the TCS Return on Capacity strategy for managing supply chains. It standardizes quantity and makes time the managed variable. But there are few cases where this simple model can actually be applied.
The point usually missed by fans of kanban is that information flow is time delayed. Any system that introduces time delays into the information flow has also introduced oscillations into the supply process....
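A toy simulation makes the oscillation visible. All parameters here are hypothetical: two bins of 50, a three-period replenishment lead time, and demand that merely alternates between 10 and 30 per period. A card fires only when a bin empties, and the order it triggers arrives three periods later:

```python
# Toy two-bin kanban simulation (all parameters hypothetical).
BIN_SIZE, LEAD_TIME = 50, 3
demand = [10, 10, 30, 30, 10, 10, 30, 30, 10, 10]

def bins_remaining(q):
    """Bins not yet emptied (a partially used bin still counts)."""
    return max((q + BIN_SIZE - 1) // BIN_SIZE, 0)

on_hand = 2 * BIN_SIZE      # start with both bins full
arrivals = {}               # period -> replenishment quantity due
history = []
for t, d in enumerate(demand):
    on_hand += arrivals.pop(t, 0)                   # receive orders due now
    cards = bins_remaining(on_hand) - bins_remaining(on_hand - d)
    on_hand -= d
    if cards > 0:                                   # bin emptied: send card
        arrivals[t + LEAD_TIME] = arrivals.get(t + LEAD_TIME, 0) + cards * BIN_SIZE
    history.append(on_hand)

print(history)
```

On-hand inventory swings from 90 down through a stockout (negative on-hand, a backorder) and back, a range far wider than the demand variation itself, purely because the signal is delayed until a bin empties and the replenishment lags behind it.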
We still hear some IT pros discussing the use of Fast Fourier Transforms (FFT) to imitate real time. After decades of debate on this one it's sad to realize that this myth still persists. Fast batch is not the same as real time, regardless of how good your transform software is.
FFT doesn't convert data to real time. It tells you what it would have looked like in real time at some point in the past. You need to have all the data to run the transform, so by definition the events are in the past.
Real time operates on every data event. It puts limits on the discovery process, so it frustrates many statisticians. But that's the real world. You can't know faster by running smaller batches more frequently. That just reduces confidence limits. And flow processes in manufacturing and supply chain still become chaotically unstable when you try.
As a variation on the Heisenberg Uncertainty Principle: You can know now or you can know for certain. Not both.
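The latency argument is simple arithmetic. An FFT over N samples cannot run until sample N has arrived, so its freshest result always describes a window that is already in the past. With hypothetical numbers:

```python
# Why a batch transform can't be "real time" (illustrative numbers).
SAMPLE_RATE_HZ = 1_000   # hypothetical sensor sampling rate
WINDOW = 256             # hypothetical FFT batch size

# The transform over a window can only start after its last sample
# arrives, so the result describes data at least this old:
min_latency_s = WINDOW / SAMPLE_RATE_HZ
print(f"freshest FFT result is at least {min_latency_s * 1000:.0f} ms stale")

# A real-time rule, by contrast, acts on each event as it arrives:
def on_event(x, limit=3.0):
    return abs(x) <= limit   # decide now, on this sample alone
```

Shrinking the window shrinks the staleness but, as the post notes, that just trades staleness for wider confidence limits; it never reaches per-event response.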
I keep hearing the same story. "We bought an ERP system so we're finished with the software debate, right?"
That's a form of denial. The idea of ERP is a grand integrated box of everything you need to run a business. It not only doesn't work, it's not possible for it to work. Three main reasons:
1. Bigger integration means broader audience, which means the added modules are more general. That works for general ledger, but not for much else. Too much of the business process is unique to a business/customer base combination. You need something more specific for order entry, not more general.
2. A very technical point - ERP operates in the frequency domain; the world operates in the time domain. It is very difficult to merge the two. For many business operations it's impossible.
3. For the systems that allow you to customize the business process, the assumption is that it's something you will do once, so it's ok to take two years to get it up and running. Exactly the opposite is true. Most businesses are highly dynamic and business process needs to match. Fixed (standardized) process means inability to adapt to change....
There was an ominous piece of information released by Census about the future of the housing market. Ultimately, housing demand is driven by household formations. This number has dropped through the floor.
Census publishes estimates of US households each quarter. Because they are derived from other industry reporting, there are often revisions. That's the nature of statistical measurements where source data is reported with some delay. 2009 required a major adjustment as the industry came to a sudden stop.
By taking the difference from the prior year we can estimate the number of households formed in the prior 12 months.
In September of 2013 the estimate of households was reported as only 97,000 above the prior year, the worst reading since the meltdown of 2009.
Year-to Year Change - September US Household Estimates Since 2000 (US Census Bureau)...
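The calculation itself is just a first difference of the September estimates. The household levels below are hypothetical placeholders; only the 97,000 delta matches the figure the post cites:

```python
# Year-over-year household formations = this year's estimate minus
# last year's. Levels here are illustrative, not actual Census data.
estimates = {            # September estimate of US households
    2012: 121_000_000,   # hypothetical level
    2013: 121_097_000,   # hypothetical level, chosen to give +97,000
}
yoy = estimates[2013] - estimates[2012]
print(f"households formed in prior 12 months: {yoy:,}")
```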
This year marks the 100th anniversary of the launch of the Model T Ford and the assembly line it made famous. Time for a little reflection.