Tag Archives: SPC

Encouraging Feature-level progress tracking in Kanban

Estimated Reading Time: 4 minutes

One of the key questions project managers and senior management in general ask themselves and their teams on an ongoing basis is – "Are we on track to deliver the scope we committed to, on time?" In some environments, "on budget" is added to the question.

If you are talking about a Release Scope, the answers are quite similar whether you're doing Scrum or Kanban. If you don't care too much about the budget aspects, a Release Burnup can show you the committed scope, the committed date, and the actual progress in working software towards that goal – Plan versus Actual. If you ARE interested in the budget picture – committed budget versus actual, and whether we are on track to finish the release within the budget we committed to – use AgileEVM on top of that. (http://www.infoq.com/articles/agile-evm is a good place to start.)
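To make the Plan-versus-Actual and AgileEVM ideas concrete, here is a minimal Python sketch of the core calculation. It assumes story points as the sizing unit; the function name and all numbers are illustrative, not taken from any tool – see the InfoQ article above for the full treatment.

```python
# Minimal AgileEVM sketch (illustrative numbers; story points as the size unit).
# PV/EV/SPI/CPI follow the standard earned-value definitions.

def agile_evm(bac, total_points, done_points,
              sprints_done, sprints_planned, actual_cost):
    """Return planned value, earned value, and the schedule/cost indices."""
    pv = bac * (sprints_done / sprints_planned)  # value we planned to earn by now
    ev = bac * (done_points / total_points)      # value earned in working software
    spi = ev / pv                                # >1 ahead of schedule, <1 behind
    cpi = ev / actual_cost                       # >1 under budget, <1 over
    return pv, ev, spi, cpi

pv, ev, spi, cpi = agile_evm(
    bac=500_000, total_points=200, done_points=90,
    sprints_done=5, sprints_planned=10, actual_cost=260_000)
print(f"PV={pv:,.0f} EV={ev:,.0f} SPI={spi:.2f} CPI={cpi:.2f}")
```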

Basically, for all of this you are measuring the amount of completed feature work compared to the amount of feature work originally planned. Whether it is sized in effort days, story points, or function points, the idea is the same.

At a conference a couple of months ago I talked about Agile Release Management and covered this subject somewhat. You can check out the slides at http://www.slideshare.net/yyeret/managing-projectsreleases-using-leanagile-techniques

I would add that this expectation of management is what we call Predictability in the Kanban world, and based on some encounters I've had with senior management, we as the Agile community have not been doing a great job of connecting to that expectation. In many cases it's the opposite – we create the impression that Predictability is a lost cause because everything is Agile.

In Kanban we try to better connect this expectation of Predictability/Commitment to the important things. Senior management doesn't care about committing to a sprint goal and meeting it. They care about meeting commitments to deliver a release on time, with the feature highlights communicated to the stakeholder community. They care about meeting commitments to deliver certain features on time to internal and external parties that count on those features in order to carry on and do something else.

Predictability will continue to be important. The way it's measured might change. For now, most teams/projects are indeed evaluated based on the answer to "Are we on track to meet the release goal on time?" We should support those teams with an approach that complements their Kanban flow-based way of working. The methodology is all there if you connect the dots.
The room for improvement is mainly in connecting those dots – providing a structured methodology that can be applied as a framework, as well as better tool support.

What are the gaps?

First, the thinking around the CFD needs to expand from a historical view to a forward-looking, predictive chart as well. What do I mean?

Most CFDs you see today focus just on an operational view – what the current state is, plus the history, which can help you improve your process and operation.

What I'm missing is a view of the work needed by a certain date, and whether we are on track to achieve our commitments/goals. Tools that extend the CFD with the current trend, the required trend to meet the goal, and the trend of requirements churn can answer this question – you see whether the DONE line is trending towards the overall committed scope in time or not.
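As a rough illustration, here is a small Python sketch of the arithmetic behind such a predictive view: it compares the actual done trend against the trend required to hit the committed scope by the committed date. The weekly numbers and the simple linear projection are my own illustrative assumptions; the requirements-churn trend could be projected the same way from the history of the committed scope.

```python
# Hypothetical inputs: cumulative "done" counts week by week, the committed
# scope, and the length of the release window. All numbers are illustrative.
done_history = [0, 4, 9, 15, 22, 27]   # cumulative done items, week by week
committed_scope = 80                   # items committed for the release
weeks_total = 16                       # weeks from start to the committed date

weeks_elapsed = len(done_history) - 1
weeks_left = weeks_total - weeks_elapsed

current_rate = done_history[-1] / weeks_elapsed                    # actual trend
required_rate = (committed_scope - done_history[-1]) / weeks_left  # needed trend

projected_done = done_history[-1] + current_rate * weeks_left
status = "on track" if projected_done >= committed_scope else "at risk"
print(f"current {current_rate:.1f}/wk, required {required_rate:.1f}/wk, "
      f"projected {projected_done:.0f}/{committed_scope} -> {status}")
```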

One more complication is that you sometimes want your board to reflect many releases, not just one. You're working to finish one release, and then you move on to another. In this case, you probably want this view per release on the board.

So we need visibility charts that can aggregate the status of several cards into a higher-level entity – Feature, Release, MMR, MMF, whatever you want to call it. In FDD, Parking Lot diagrams are a popular way to convey the status of various features/aspects in a Project/Release. One extension of a Parking Lot diagram is to add a mini-burnup for each entity. So beyond just the status (which is basically the current point of a burnup), you can have a mini-graph showing the progress of the items comprising this feature. See below for a sketch of how this can look. (Note that the Warning Indicators box is taken straight from the organizational dashboard page of LeankitKanban. I recently started to explore the capabilities of this dashboard and find them quite useful for helping bring a process under control – the sort of stuff you might want to look at in an operational review.)

The color of each parking lot / feature can easily be derived from where actual progress stands compared to the expected progress curve. The expected curve can be defined as linear (yeah right), S-curve based as David Anderson is fond of, or whatever you think the profile should look like. Once you are below the curve, you start to gain reddish colors; above it, you are green. With Agile approaches relying on Working Software as the measure of progress, you can really trust those colors… It's no longer a watermelon (green outside, red inside – courtesy of Kent Beck).
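For illustration, here is one possible way to derive that color in code. The logistic S-curve, the steepness value, and the amber tolerance are all my own illustrative choices, not a prescribed formula.

```python
import math

def expected_progress(t, profile="s-curve", steepness=10):
    """Expected fraction complete at normalized time t in [0, 1]."""
    if profile == "linear":
        return t
    # Logistic S-curve, rescaled so it is exactly 0 at t=0 and 1 at t=1.
    raw = lambda x: 1 / (1 + math.exp(-steepness * (x - 0.5)))
    return (raw(t) - raw(0)) / (raw(1) - raw(0))

def parking_lot_color(actual_fraction, t, profile="s-curve", tolerance=0.05):
    """RAG color: green at/above the expected curve, amber/red below it."""
    gap = expected_progress(t, profile) - actual_fraction
    if gap <= 0:
        return "green"
    return "amber" if gap <= tolerance else "red"

# A feature 30% done at 40% of its planned window:
print(parking_lot_color(0.30, 0.4))                    # s-curve expects ~27% -> green
print(parking_lot_color(0.30, 0.4, profile="linear"))  # linear expects 40%   -> red
```

Note how the choice of profile changes the verdict on the same data – one more reason to pick the curve that matches how your features actually burn up.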

For those interested in the details, here is one way a CFD can be extended to provide burnup capabilities. 


With this in mind, the mini-burnup in the parking lot can be upgraded to a mini-CFD.

Now, with a CFD, some more intelligence can be applied to help determine the color/state of the Feature. A high level of WIP can be a leading indicator of a problem (though, knowing Little's Law and what a CFD looks like, you probably know it will show up as a fairly flat burnup as well…). I'm guessing that with time, we will learn to identify progress risks using a CFD beyond the basics we currently use.
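As a toy example of such intelligence, here is a sketch that applies Little's Law to hypothetical mini-CFD data – the daily numbers and the alert threshold are invented for the example:

```python
# Little's Law on a feature's mini-CFD data: average WIP divided by throughput
# approximates cycle time, so ballooning WIP predicts a flattening "done" line
# before it fully shows up in the burnup.
daily_wip = [3, 4, 6, 9, 11, 12]   # stories in progress, day by day
daily_done = [1, 1, 1, 0, 1, 0]    # stories finished per day

avg_wip = sum(daily_wip) / len(daily_wip)
throughput = sum(daily_done) / len(daily_done)   # stories per day
expected_cycle_time = avg_wip / throughput       # Little's Law, in days

print(f"avg WIP {avg_wip:.1f}, throughput {throughput:.2f}/day "
      f"-> ~{expected_cycle_time:.0f} days per story")
if expected_cycle_time > 7:   # arbitrary threshold for the example
    print("leading indicator: WIP is inflating cycle time; expect a flat burnup")
```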

Bottom line – my feeling is that for Kanban to cross the chasm into the majority of projects/development groups, who are quite concerned with delivering Releases and Features on schedule, not just with trusting the Flow, we will need to provide more and more tools and focus to support this use case. The core thinking is there, and the hunger on the part of the IT world seems to be there as well, so let's go out there and make it happen. My 2c…


Can we PLEASE have some simple measures for our Product Development group?

Estimated Reading Time: 1 minute

A lot of our clients at AgileSparks ask us how to measure their effectiveness. Some of them are already using Agile styles of product development, others are not yet there, and another important variant is the Enterprise with mixed ways of doing things that wants more visibility and wants to use measures as a way to drive improvement.

As we all know, we need to be very careful about what we measure. On top of that, a lot of measurements require a lot of work – and people don't really like to feel they are working for the measurements; they want the measurements to work for them.

Be very careful of reports for Management that require the team/production floor to go out of their way. 

Having said that, ever since we started to focus more heavily on Kanban and Lean thinking, a couple of simple KPIs have emerged, with the added attribute that if you're already using Kanban to manage your work, you get most of them for free. 
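To illustrate the "for free" point: if the board already records when each card started and finished, the basic KPIs fall out of those timestamps with no extra bookkeeping. A sketch, with hypothetical card data:

```python
from datetime import date

# Hypothetical card records as a Kanban tool might export them.
cards = [
    {"id": "A", "started": date(2011, 3, 1), "finished": date(2011, 3, 9)},
    {"id": "B", "started": date(2011, 3, 2), "finished": date(2011, 3, 16)},
    {"id": "C", "started": date(2011, 3, 7), "finished": date(2011, 3, 18)},
]

lead_times = [(c["finished"] - c["started"]).days for c in cards]
avg_lead_time = sum(lead_times) / len(lead_times)

window = (max(c["finished"] for c in cards)
          - min(c["started"] for c in cards)).days
throughput = len(cards) / window   # cards finished per day over the window

print(f"avg lead time {avg_lead_time:.1f} days, "
      f"throughput {throughput:.2f} cards/day")
```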

Chris Hefley, one of the guys behind LeankitKanban, was recently interviewed on SPaMCast 100 (which is a very good podcast, worth listening to), and some of the discussion revolves around this point of Kanban providing great metrics that don't require any effort beyond managing the work.

I've been presenting these kinds of metrics to a couple of clients lately. Here is my presentation. Notice it mainly references Lean/Kanban concepts but doesn't describe them in detail. Go to www.agilesparks.com/kanban or www.limitedwipsociety.org for references to resources about those concepts. 

Kanban early warning using a predictive variant of SPC

Estimated Reading Time: 2 minutes

A confession: while I'm a great fan of using SPC charts to explore specific cycle times and reduce variation / continuously improve a Kanban system (Benjamin Mitchell has a great blog on this), I'm only seeing preliminary results in the field with the teams I'm coaching. The main reasons are lack of tooling, lack of incentive to manage this manually, and the fact that teams are not yet mature enough. I'm hoping this will change in the near future.

With that in mind, a constant concern I'm hearing is that finding out about an exception based on SPC comes too late. Why is that? Because a classic SPC chart looks at final cycle time – and final, by definition, means the action is already over…
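For readers who want to see the mechanics, here is a sketch of that classic, after-the-fact SPC view, using the standard XmR individuals-chart limits (mean ± 2.66 × average moving range) on illustrative final cycle times:

```python
# Classic SPC (XmR / individuals chart) on *final* cycle times. The point of
# the post: the exception is only flagged once the outlier card is already done.
cycle_times = [4, 6, 5, 7, 5, 6, 25, 5, 6]   # days; illustrative data

mean = sum(cycle_times) / len(cycle_times)
moving_ranges = [abs(b - a) for a, b in zip(cycle_times, cycle_times[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * avg_mr            # upper control limit
lcl = max(0.0, mean - 2.66 * avg_mr)  # lower control limit, floored at zero

for i, ct in enumerate(cycle_times, start=1):
    flag = "  <-- exception (but the card is already finished)" if ct > ucl else ""
    print(f"card {i}: {ct}d{flag}")
print(f"mean {mean:.1f}d, UCL {ucl:.1f}d, LCL {lcl:.1f}d")
```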

A couple of months ago, I read about "average cycle time in column" in the GSK R&D Case Study by Greg Frazer.

A couple of days ago, I had the idea that perhaps, using the collected history of cycle times in columns/lanes, a current prediction of the final cycle time can be calculated for each card in the system. This prediction can then be traced on an SPC-like chart, and exceptions can be identified more clearly (see the illustration below for an example).
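Here is a rough sketch of how such a prediction could be computed, under my own simplifying assumptions: the column names and per-column averages are hypothetical, and a card is assumed to need at least the historical average for the columns it has passed through plus the averages of the columns still ahead of it.

```python
# Hypothetical per-column averages collected from the board's history.
avg_time_in_column = {"analysis": 2.0, "dev": 5.0, "test": 3.0, "deploy": 1.0}
column_order = ["analysis", "dev", "test", "deploy"]

def predicted_cycle_time(days_elapsed, current_column):
    """Elapsed time so far + average time for the remaining columns."""
    idx = column_order.index(current_column)
    remaining = sum(avg_time_in_column[c] for c in column_order[idx + 1:])
    # Assume the card consumes at least the historical average of the columns
    # up to and including its current one.
    spent_budget = sum(avg_time_in_column[c] for c in column_order[:idx + 1])
    return max(days_elapsed, spent_budget) + remaining

# Two in-flight cards: one roughly on pace, one stuck in dev.
print(predicted_cycle_time(days_elapsed=6, current_column="dev"))   # -> 11.0
print(predicted_cycle_time(days_elapsed=14, current_column="dev"))  # -> 18.0
```

Tracing these rolling predictions on an SPC-style chart like the one above would surface the stuck card well before it actually finishes.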

This reminds me of charts used to track "Due Date Performance" on releases/milestones, which I saw at one of our clients. I later learned they are called Slip Charts.

The main difference is that the SPC Y-axis shows cycle time length, while in the due date tracking chart it's the actual date. I think it's probably sufficient to just rely on the SPC charts. I would urge organizations tracking due date performance on releases and milestones to start doing SPC at that level, even if they don't use Kanban/Agile at the Feature/Story level. They might learn a few things that can help them bring their process under control and improve their predictability.

Back to the predictive SPC – no tool that I'm aware of currently offers this capability, but one can always hope.
I see capabilities such as identifying struggling work items based on exceptions from the "average time in lane", as well as exceptions in the overall predicted cycle time, as key to taking "early feedback and action" one step forward, and to scaling Kanban into something project managers swear by.

If you are aware of any Kanban tools that provide this – I'll be happy to hear about it.