
High Performance Blog

Updates on the world of high performance organizations and the information they use.

Posted in Manufacturing & Supply Chain

Durable Goods Showing Strength


Durable goods orders surged in February (April report), in sharp contrast to the typical slowdown for the month.  

March employment (April report) improved, rising above 60% of the population for the first time since 2007.

Click HERE for the full report.


Have you ever thought about how the BOM processor works in your ERP system?  Not many do.  The first one was written in 1963, and they still work the same way today.  Put in the count of end items, and the BOM processor explodes it into the number of intermediate and component parts required.

If you manage a supply chain you've probably wondered if you could do it for the entire supply network.  It has rarely been tried, due to the limitations of existing data-management technology.

What if you had a different technology platform?  What if you could thread the BOM structure directly, so that every branch was directly connected to the end item?  And what if you could traverse each level in the branch in 32 nanoseconds?  Would that change how you built the BOM?  Now you can with Ancelus.

Now let's really get outside the box.  What if you wanted to run the BOM backwards?  

  • Today we ask the question this way: Here's what I need to deliver, what do I need to allocate or buy?
  • What if we did the reverse: Here's the material I have on hand, what end items in the schedule can I build to completion?

With Ancelus the BOM processor could run the explosion or the implosion with equal speed.
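The explosion/implosion symmetry can be sketched in ordinary Python. The part names, quantities, and recursive dictionary traversal below are illustrative assumptions, not Ancelus internals:

```python
from collections import defaultdict

# Illustrative single-level BOM: parent -> [(component, qty_per_parent)].
# Part names and quantities are made-up examples.
BOM = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
}

def explode(item, count, req=None):
    """Forward pass: end-item demand -> total component requirements."""
    if req is None:
        req = defaultdict(int)
    for comp, qty in BOM.get(item, []):
        req[comp] += count * qty
        explode(comp, count * qty, req)   # recurse down the branch
    return req

def buildable(item, on_hand):
    """Reverse pass: how many of `item` can be completed from stock?"""
    if item not in BOM:                   # purchased part: limited by stock
        return on_hand.get(item, 0)
    return min(buildable(c, on_hand) // q for c, q in BOM[item])

demand = explode("bike", 10)   # frame: 10, wheel: 20, rim: 20, spoke: 640
supply = buildable("bike", {"frame": 5, "rim": 8, "spoke": 200})   # 3 bikes
```

A threaded structure of the kind the post describes would replace the dictionary lookups with direct pointers from every branch to the end item, which is what makes a constant, nanosecond-scale cost per level plausible; both directions then traverse the same links.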


It is common for companies with multiple products to compare performance using gross margin.  Allocation of indirect (overhead) expenses at the product level is rarely possible with any accuracy.  So the next best thing is to measure revenue less direct expense (gross profit) as a % of revenue (gross margin). 

The goal of this post is to show how that practice leads to operating decisions that run opposite to those goals.

To study the behavior of this decision cycle we model a company with two products. Indirect expenses and deployed capacity are assumed to be constant in total, with only the allocation between the products to be decided. 

Base Case

The following simplified income statement shows our starting position.
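A base case of this shape can be sketched numerically. Every figure below is an illustrative assumption, not a number from the original income statement:

```python
# Illustrative two-product base case; all figures are assumptions.
products = {
    "A": {"revenue": 1_000_000, "direct_cost": 600_000},   # 40% gross margin
    "B": {"revenue":   500_000, "direct_cost": 250_000},   # 50% gross margin
}
indirect_total = 400_000   # fixed in total; only its allocation varies

for name, p in products.items():
    gp = p["revenue"] - p["direct_cost"]
    print(f"{name}: gross profit {gp:,}, gross margin {gp / p['revenue']:.0%}")

# Company-level operating income does not depend on how the indirect
# expense is split between A and B -- only the per-product optics change.
total_gp = sum(p["revenue"] - p["direct_cost"] for p in products.values())
operating_income = total_gp - indirect_total
```

Note the setup: since indirect expense and capacity are constant in total, any allocation scheme leaves operating income unchanged while making one product look "better" than the other, which is exactly the trap the post goes on to examine.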


This is going to be a long post, the first in a three-part series. Hang in there; the insights will be useful.

One of my disappointments with our industry is the pervasive use of backward-looking accounting methods for forward-looking decision support.  Accounting measures are intended as historical tools. But the independent variables - those things you control to produce the accounting results - generally aren't found in the accounting model.

Fixed Cost, Variable Cost

An example is the Fixed-Variable Model used to define "break-even volume."  It is universally accepted because it fits the structure of the income statement.  It is taught in every accounting course and MBA program.  It is profoundly wrong.  It has some limited use in post-hoc analysis, but as a decision-support tool it is dangerously misleading.

 A few simple models will help expose the flaws.
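For reference, here is the textbook model itself, the one the post argues against. The prices and costs are made-up numbers:

```python
def break_even_units(fixed_cost, price, variable_cost):
    """Classic fixed-variable break-even volume: the unit count at which
    revenue exactly covers total cost. Inputs here are illustrative."""
    contribution = price - variable_cost   # per-unit contribution margin
    if contribution <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return fixed_cost / contribution

# Example: $120,000 fixed cost, $50 price, $30 variable cost per unit
volume = break_even_units(120_000, 50, 30)   # 6000.0 units
```

The model's appeal is obvious: one division and you have "the" answer. Its flaw, which the post develops, is that it treats cost behavior as a property of the income statement rather than of the variables you actually control.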

©copyright 2006-2015 Time Compression Strategies

Most of the Six Sigma community considers Six Sigma an analysis technique.  That is only partly true.  We rarely hear speed connected to these discussions, yet it is the most fundamental element.  Here's why.

1. The objective of the analysis is to find the cases where a process ceased to behave randomly, determine the cause and correct it.

2. A process deviates because something has happened that its designers had not considered.  If they had considered it they would have designed in a way to control it.

3. If the cause was not considered during design, it's unlikely you are collecting data on it.  So exactly what are you going to analyze?  This means you are going to collect more data, but on what?

So a critical element of any Six Sigma program must be speed of detection.  It is imperative to catch the problem in the act, before the gun smoke drifts away.  Without this ability, you are unlikely to find root cause in a timely fashion.
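A minimal sketch of speed-of-detection: check every measurement against the control band the instant it arrives, rather than analyzing a batch later. The baseline mean and sigma are assumed known from a stable period, and all values are illustrative:

```python
def make_detector(mean, sigma, k=3.0):
    """Flag a measurement the moment it leaves the k-sigma band.
    Baseline mean/sigma are assumed known from a stable period."""
    lo, hi = mean - k * sigma, mean + k * sigma
    def check(x):
        return not (lo <= x <= hi)   # True means: investigate NOW
    return check

check = make_detector(mean=10.0, sigma=0.2)   # illustrative baseline
for i, x in enumerate([10.1, 9.9, 10.2, 11.3, 10.0]):
    if check(x):
        print(f"event {i}: {x} is out of control -- catch it in the act")
```

The point is not the arithmetic, which is trivial, but the timing: the deviation at event 3 is flagged while its cause is still present and observable, before the gun smoke drifts away.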


Posted in Manufacturing & Supply Chain

There's an interesting thread running in the Lean/Six Sigma group on LinkedIn about "what is three sigma."  It has wandered all over the map since January, and at times must be really confusing for newcomers.  The jargon is daunting.

A problem with this movement is that it has attempted to subsume everything in manufacturing into a single "process" for improvement.

Six Sigma was originally much simpler.  Here's an attempt to reduce it to its simplest essence:

1. Everything measured has a natural variability.  Natural variability is distributed randomly.  This defines 3-sigma boundaries for any measured value.  Forget the math for the moment.

2. A process running at Six Sigma has a natural variability of 3 sigma.  (Stick with me here).
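Point 1 can be made concrete in a few lines; the measurements below are invented:

```python
from statistics import mean, stdev

# Illustrative measurements from a stable process (invented values)
samples = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.0]

m, s = mean(samples), stdev(samples)
lower, upper = m - 3 * s, m + 3 * s      # the natural 3-sigma band
print(f"mean {m:.2f}, sigma {s:.2f}, 3-sigma band {lower:.2f} .. {upper:.2f}")
```

Point 2 then says the spec limits of a process "running at Six Sigma" sit six of these sigmas from the mean, so the natural 3-sigma variation lives well inside spec with room to spare.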


Posted in Manufacturing & Supply Chain

The concept of kanban has held a special attraction for factory managers over the years.  The reason is its simplicity, and the apparent absolute accuracy of responding to the consumption signals of a downstream operation.  Especially attractive is that no computers are needed in the system.

But there are some serious limitations to watch out for.  They are routinely missed and can make a real mess.

The easiest explanation of kanban is the "two bin system" used for replenishment signals in low value parts.  Start with two bins of parts.  At the bottom of each bin is a card used to reorder the parts.  One bin is active and the second is reserve.  When the active bin is emptied, the card is removed from the bin and sent to purchasing.  The reserve bin is converted to active.

Simple enough.  And it delivers one of the most important steps in the TCS Return on Capacity strategy for managing supply chains.  It standardizes quantity and makes time the managed variable.  But there are few cases where this simple model can actually be applied.

The point usually missed by fans of kanban is that information flow is time-delayed.  Any system that introduces time delays into the information flow has also introduced oscillations into the supply process.
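The two-bin mechanics described above can be sketched as follows; the bin size and consumption pattern are assumptions:

```python
class TwoBin:
    """Minimal two-bin kanban. Consume from the active bin; when it
    empties, release the reorder card and swap in the reserve bin.
    Bin size and part usage below are illustrative assumptions."""
    def __init__(self, bin_size):
        self.bin_size = bin_size
        self.active = bin_size      # bin currently being drawn down
        self.reserve = bin_size     # full bin waiting behind it
        self.cards_released = 0     # reorder signals sent to purchasing

    def consume(self, qty=1):
        self.active -= qty
        if self.active <= 0:        # active bin just emptied
            self.cards_released += 1
            self.active += self.reserve   # reserve becomes active
            self.reserve = 0              # empty until the order arrives

    def receive(self):
        self.reserve = self.bin_size      # replenishment delivered

bin_ = TwoBin(bin_size=50)
for _ in range(50):
    bin_.consume()          # the 50th part empties the bin -> card released
bin_.receive()              # ...some (variable!) lead time later
```

The gap between the card release and `receive()` is exactly the time delay the paragraph above warns about: the quantity is standardized, and time becomes the variable that must be managed.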


Posted in Manufacturing & Supply Chain

We still hear some IT pros discussing the use of Fast Fourier Transforms (FFT) to imitate real time.  After decades of debate on this one it's sad to realize that this myth still persists.  Fast batch is not the same as real time, regardless of how good your transform software is.

FFT doesn't convert data to real time.  It tells you what it would have looked like in real time at some point in the past.  You need to have all the data to run the transform, so by definition the events are in the past.

Real time operates on every data event.  It puts limits on the discovery process, so it frustrates many statisticians.  But that's the real world.  You can't know faster by running smaller batches more frequently.  That just reduces confidence limits.  And flow processes in manufacturing and supply chain still become chaotically unstable when you try.

As a variation on the Heisenberg Uncertainty Principle: You can know now or you can know for certain.  Not both.
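The batch-versus-real-time distinction (the general point, not FFT itself) can be illustrated with something as simple as a mean: the streaming version yields a decision-ready value after every event, while the batch answer only exists once all the data is in.

```python
# Streaming: act on every event as it arrives (no waiting for a batch).
def running_mean(stream):
    n, mean = 0, 0.0
    for x in stream:
        n += 1
        mean += (x - mean) / n   # incremental update per event
        yield mean               # decision-ready after EVERY point

# Batch: the answer only exists after all the data has arrived.
def batch_mean(data):
    return sum(data) / len(data)

events = [4.0, 6.0, 5.0, 7.0]
print(list(running_mean(events)))  # [4.0, 5.0, 5.0, 5.5]
print(batch_mean(events))          # 5.5 -- known only after the fact
```

Running the batch more often shrinks the lag but never eliminates it; the streaming form is the only one that answers "what do I know now?" at every event.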



Posted in Manufacturing & Supply Chain

I keep hearing the same story.  "We bought an ERP system so we're finished with the software debate, right?"

That's a form of denial.  The idea of ERP is a grand integrated box of everything you need to run a business.  It not only doesn't work, it's not possible for it to work.  Three main reasons:

1. Bigger integration means broader audience, which means the added modules are more general.  That works for general ledger, but not for much else.  Too much of the business process is unique to a business/customer base combination.  You need something more specific for order entry, not more general.

2. A very technical point - ERP operates in the frequency domain, the world operates in the time domain.  Very difficult to merge the two.  For many business operations it's impossible.

3. For the systems that allow you to customize the business process, the assumption is that it's something you will do once, so it's OK to take two years to get it up and running.  Exactly the opposite is true.  Most businesses are highly dynamic, and the business process needs to match.  A fixed (standardized) process means an inability to adapt to change.


There was an ominous piece of information released by Census on the future of the housing market.  Ultimately housing demand is driven by household formations.  This number has dropped through the floor.

Census publishes monthly estimates of US households each quarter.  Because they are derived from other industry reporting, there are often revisions.  That's the nature of statistical measurements where source data is reported with some delay.  2009 required a major adjustment as the industry came to a sudden stop.

By taking the difference from the prior year we can estimate the number of households formed in the prior 12 months.

In September of 2013 the estimate of households was reported as only 97,000 above the prior year, the worst since the meltdown of 2009.

Year-to-Year Change - September US Household Estimates Since 2000 (US Census Bureau)
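The year-over-year arithmetic is simple. In the sketch below only the ~97,000 change for 2013 is taken from the post; the absolute household levels are placeholders:

```python
# September household estimates; the absolute levels are placeholders --
# only the 2012 -> 2013 change (~97,000) comes from the post.
september_households = {
    2011: 120_200_000,   # assumed
    2012: 121_500_000,   # assumed
    2013: 121_597_000,   # chosen so the YoY change matches the post
}

years = sorted(september_households)
for prev, cur in zip(years, years[1:]):
    change = september_households[cur] - september_households[prev]
    print(f"{cur}: {change:+,} households formed vs. prior September")
```

Against a placeholder prior-year formation of 1.3 million, a 97,000-household year is the "dropped through the floor" the post describes.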


Posted in Manufacturing & Supply Chain

This year marks the 100th anniversary of the launch of the Model T Ford and the assembly line it made famous.  Time for a little reflection.

  • Ford was able to deliver a finished car within 80 hours from the time the iron ore arrived on the dock.  We can't do that today.  Our supply chains are more "efficient" and "cost optimized."  Ford knew that velocity was the dominant variable.
  • The story is told that Ford had the idea when he was touring a meat packing plant in Chicago (a dis-assembly line, actually).
  • The assembly process was protected from the unstable behavior of managed supply chains by dictating a constant production rate: a single model, with changes in production rate announced long in advance.
  • To assure compliance Ford owned most of the supply chain: iron mines in the Upper Peninsula of Michigan, taconite mills, ore freighters, a steel mill, and more.


