Latest Presentation – 21st Century Portfolio Management

I recently spoke at the LKCE conference in Vienna on 21st Century Portfolio Management. The talk was recorded and is available here: http://vimeo.com/52546904. It’s about an hour long.

I’ve now presented this material in Madrid, Boston, Tokyo, Vienna, Utrecht, and at various clients around Australia – and each time the content of the talk has generated a good amount of interest.

The feedback I have been getting in person after each talk is that there isn’t a lot out there (in books, articles, blog posts, guidance from agile consultancies, etc.) on Agile at the portfolio level and beyond, and that much of what I talk about is classed as undiscussable in most organisations.

My shorter Boston talk, recorded back in May, has generated over 1000 views (the next most watched being Steve Denning, David Anderson and Don Reinertsen with a few hundred each), which backs this interest up.

The good news is that in Australia we are actually doing what I talk about i.e. it’s not just theory. I hope to publish more on that (and the results) in the future.

Lean Software Management BBC Worldwide Case Study

Dr Peter Middleton and I have had our “Lean Software Management BBC Worldwide Case Study” paper accepted by the IEEE Transactions on Engineering Management. It will be published in the February 2012 issue. The paper was edited by Dr Jeffrey K. Liker, author of The Toyota Way.

You can download a copy of the paper prior to its publication here.

I believe it will be one of the most significant papers in Software Engineering this decade.

David Anderson

Abstract

This case study examines how the lean ideas behind the Toyota production system can be applied to software project management. It is a detailed investigation of the performance of a nine-person software development team employed by BBC Worldwide based in London. The data collected in 2009 involved direct observations of the development team, the kanban boards, the daily stand-up meetings, semi-structured interviews with a wide variety of staff, and statistical analysis.

The evidence shows that over the 12-month period, lead time to deliver software improved by 37%, consistency of delivery rose by 47%, and defects reported by customers fell 24%.

The significance of this work is showing that the use of lean methods including visual management, team-based problem solving, smaller batch sizes, and statistical process control can improve software development. It also summarizes key differences between agile and lean approaches to software development. The conclusion is that the performance of the software development team was improved by adopting a lean approach. The faster delivery with a focus on creating the highest value to the customer also reduced both technical and market risks. The drawback is that it may not fit well with existing corporate standards.

Value Delivered

The paper doesn’t include the increase in business value delivered over the period of study. This was due to confidentiality agreements. What I can say is that during the period of study, the digital assets produced rose by hundreds of thousands of hours of content, a 610% increase in valuable assets output by software products written by the team.

Authors

Peter Middleton received the M.B.A. degree from the University of Ulster, Northern Ireland, in 1987, and the Ph.D. degree in software engineering from Imperial College, London, U.K., in 1998.

He is currently a Senior Lecturer in computer science at Queen’s University Belfast, Northern Ireland. He is the coauthor of the book Lean Software Strategies, published in 2005, and the editor of a book of case studies on applied systems thinking, Delivering Public Services that Work, published in 2010. His research interests include combining systems thinking with lean software development to help organizations significantly improve their performance.

David Joyce is a Systems Thinker and Agile practitioner with 20 years’ software development experience, of which 12 years are in technical team management and coaching. In recent years, David has led both onshore and offshore teams and successfully took an internet video startup from inception to launch. More recently he has coached teams on Lean, Kanban and Systems Thinking at BBC Worldwide in the U.K. He is a Principal Consultant at ThoughtWorks.

Mr. Joyce was awarded the Lean SSC Brickell Key award for outstanding achievement and leadership.

Programme Level Kanban

I was recently asked

I’m looking for pointers and experience of running programmes with Agile, particularly topics such as:
– team structures
– communication and coordination processes

Rather than Mike Cohn’s Scrum of Scrums, my answer is to use a master Kanban board to visualise the progress of the projects within a programme, which in turn will naturally enhance the communication and coordination process.

The programme board has cards only for the sub-projects or sub-feature sets (MMFs). The detail for each of these is broken out on each team’s own Kanban board, not on the programme-level board.

This approach visualises what is going on at a higher level, and enables the various representatives from each of the sub teams to collaborate, understand what is coming their way that could affect them, and facilitate synchronisation.

A daily standup is still held, but the rhythm is around:

  1. what is blocking your team, or about to block another team
  2. what work is in progress
  3. bottlenecks (either current or impending)
  4. are priorities clear on what gets pulled next
  5. what needs to be expedited

The standups still run from right to left on the board – in other words upstream, from what is about to be released all the way back to analysis.

Each team records Lead Time and Cycle Time in elapsed working days (some sub-project teams may still use points, but augment these with LT and CT). This enables those at the programme level to compare teams. Sub-teams with longer Lead Times are asked if they need more resources or assistance in removing bottlenecks, or whether we have to go and work on the system conditions. The only caveat is keeping managers in check so that they don’t start using these metrics for pointing fingers at “slow” teams rather than as continual improvement opportunities.
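As an illustration of what “elapsed working days” means here, a minimal sketch – the helper name and dates are my own, not taken from any team’s actual tooling:

```python
from datetime import date, timedelta

def working_days(start: date, end: date) -> int:
    """Count elapsed working days (Mon-Fri) between two dates, exclusive of the start day."""
    days = 0
    current = start
    while current < end:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            days += 1
    return days

# Lead Time runs from customer request to delivery;
# Cycle Time runs from when work starts to delivery.
requested = date(2009, 10, 5)   # a Monday
started = date(2009, 10, 7)     # the Wednesday of that week
delivered = date(2009, 10, 16)  # the Friday of the following week

lead_time = working_days(requested, delivered)    # 9 working days
cycle_time = working_days(started, delivered)     # 7 working days
```

Counting in working days rather than calendar days keeps weekends from inflating the comparison between teams.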

As Dr Peter Middleton says, the usual constraint on programme visualisation is the ability of the human mind to handle complexity. This is why tools struggle: they show every individual work item for every sub-project, which is too complex. There is no need for one gigantic board or tool visualising everything in fine detail.

Journey to Systemic Improvement – Lean eXchange presentation

Today I gave a talk at the UK Lean eXchange entitled Journey to Systemic Improvement.

My slides can be found here.

Note it is a media-rich presentation so the PDF is almost 50 MB!

A video recording of the presentation and our second running of the Red Bead Experiment will soon be available.

Kanban Results

Over the past year our Kanban teams have been striving to reduce the following:

  • Lead Time – the time it takes from a customer request to when it is delivered
  • Development time – the time it takes from entering the Ready For Development queue to when it is handed off to QA
  • Engineering time – the time it takes from entering the Ready For Engineering queue to when it has passed QA, left Engineering, and is ready for UAT
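The three measures above could be derived from a card’s date stamps roughly as follows – a sketch only, with hypothetical field names rather than our actual tooling:

```python
from datetime import date

# Hypothetical date stamps recorded as a card crosses each stage of the board
card = {
    "requested": date(2009, 9, 1),       # customer request received
    "ready_for_eng": date(2009, 9, 8),   # entered the Ready For Engineering queue
    "ready_for_dev": date(2009, 9, 10),  # entered the Ready For Development queue
    "handed_to_qa": date(2009, 9, 15),   # development finished, handed off to QA
    "ready_for_uat": date(2009, 9, 18),  # passed QA, left Engineering, ready for UAT
    "delivered": date(2009, 9, 22),      # released to the customer
}

# Lead Time: customer request -> delivery
lead_time = (card["delivered"] - card["requested"]).days          # 21
# Development time: Ready For Development -> handed off to QA
development_time = (card["handed_to_qa"] - card["ready_for_dev"]).days  # 5
# Engineering time: Ready For Engineering -> passed QA / ready for UAT
engineering_time = (card["ready_for_eng"] and
                    (card["ready_for_uat"] - card["ready_for_eng"]).days)  # 10
```

(Calendar days are used here for brevity; in practice we measure in elapsed working days.)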

Through various means we have seen improved results, which are depicted below in Statistical Process Control charts using data taken from our largest Kanban team. These means include:

  • working on the system
  • talking about blockers first in the standup
  • actively assigning, escalating and removing blockers
  • recognising and reducing bottlenecks
  • retrospectives
  • improving our process by separating common cause problems from special cause problems
  • using MMFs with component stories and tasks
  • implementing Kaizen
  • implementing classes of service
  • highlighting items that have been on the board for too long

Note that the links above point to other areas of this blog that describe in more detail how each of these was achieved. You can click on each of the charts below to see a larger version.

Lead Time

Lead time has reduced from a mean of 22 days to 14 days over the past year. There is a consistent downward trend, with the majority of the most recent items under the mean. Each of the outliers was shown to be due to a special cause. The periods on the charts have been split from 2008 until our financial year end (April 2009), and from July 2009 until October 2009.

Lead Time Oct 09
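For readers unfamiliar with SPC charts: the mean and natural process limits on an individuals (XmR) chart are conventionally estimated from the average moving range. A sketch, using made-up lead times rather than the chart’s real data:

```python
def xmr_limits(values):
    """Mean and natural process limits for an individuals (XmR) chart.

    Limits are estimated from the average moving range: mean +/- 2.66 * mR-bar.
    Points outside the limits are candidates for special cause investigation.
    """
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    upper = mean + 2.66 * mr_bar
    lower = max(0.0, mean - 2.66 * mr_bar)  # a lead time cannot be negative
    return mean, lower, upper

# Illustrative lead times in working days (not the team's actual data)
lead_times = [22, 18, 25, 14, 20, 16, 31, 12, 15, 13]
mean, lcl, ucl = xmr_limits(lead_times)
outliers = [x for x in lead_times if x > ucl or x < lcl]
```

Points inside the limits are common cause variation – noise in the system – and reacting to them individually is tampering; only points outside the limits warrant a hunt for a special cause.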

Development Time

Development time has reduced from a mean of 9 days to 3 days over the past year. There is a consistent downward trend. This portion of the value stream was directly under the team’s control and not subject to delays from 3rd parties or upstream or downstream parties. The major factor in reducing development time has been limiting work in process. The periods on the charts have been split from 2008 until our financial year end (April 2009), and from July 2009 until October 2009.

Dev Time Oct 09

Engineering Time

Engineering time has reduced from a mean of 11 days to 8 days over the past year. Once again there is a downward trend; however, there are more outliers that required investigation. Some of the outliers were shown to be due to special causes, but the majority were down to waiting for 3rd parties to complete their development and QA – something the team actively worked to reduce. The periods on the charts have been split from 2008 until our financial year end (April 2009), and from July 2009 until October 2009.

Engineering Time Oct 09

Throughput

We class throughput as the number of items released, and would expect an upward trend as the code base is decoupled, work items are broken into MMFs, and cycle time reduces. The chart below shows this upward trend in the number of releases per month. Note that we are subject to release freezes, hence the drop from December to February when the freezes were imposed.

Releases Oct 09

Bugs Per Week

We need to ensure that the reduction in lead and cycle times, and the increase in throughput, are not at the expense of quality. The chart below shows that the number of live bugs is within statistical control, and since July we have actually seen a reduction.

Bugs per week Oct 09

 

There are now several follow-up posts to this original post:

http://leanandkanban.wordpress.com/2009/10/25/kanban-results-feedback/

http://leanandkanban.wordpress.com/2009/10/26/kanban-results-part-2/

 

Design Team Kanban Evolution 2

I blogged previously that our Design team were looking to modify their Kanban board.

Here is a picture of their new board. Note the following:

  • An Express lane for items that need to be expedited through the system
  • Swimlanes for each specialisation
  • Limiting work in progress by limiting Avatar tokens
  • Limiting queues by only allowing a certain number of slots for cards
  • Making blockers visible with pink Post-its stuck to the blocked item, plus a swimlane of their own.

new design Kanban board
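One way to picture how limited avatar tokens cap work in progress – a rough sketch with illustrative names, not the Design team’s actual setup:

```python
class Swimlane:
    """A specialisation's lane: a card may only enter progress with a free avatar token."""

    def __init__(self, name, avatars):
        self.name = name
        self.free_avatars = list(avatars)  # each team member has exactly one token
        self.in_progress = {}              # card -> avatar working on it

    def pull(self, card, avatar):
        """Pull a card into progress if the avatar's token is free; otherwise refuse."""
        if avatar not in self.free_avatars:
            return False  # WIP limit hit: this avatar is already on another card
        self.free_avatars.remove(avatar)
        self.in_progress[card] = avatar
        return True

    def finish(self, card):
        """Complete a card and return the avatar token to the pool."""
        self.free_avatars.append(self.in_progress.pop(card))


lane = Swimlane("UX", ["alice"])
assert lane.pull("card-1", "alice")
assert not lane.pull("card-2", "alice")  # token in use, so WIP is capped at 1
lane.finish("card-1")
assert lane.pull("card-2", "alice")      # token returned, pull succeeds
```

Because the number of tokens equals the number of people, nobody can start a second item before finishing the first – the limit is enforced by the physical board rather than by policing.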

Another Kanban Board Example

As I have mentioned previously we have many teams using a Kanban System.

Here is an example of a Kanban board from one of our product teams. Note that they use the following:

  • A star above each lane where a date stamp is required; the stamps are used to produce metrics (lead time & cycle time) to help the team improve. This team literally uses a stamp to record the date on the card, which gives a nice satisfying thump each time a card is stamped.
  • A No Entry Cross sign to limit work in progress. For example you can only fit 2 cards in the Ready for Review stage so there is no need to write the number 2 above that stage.
  • A Feature (MMF) input queue containing the next MMF ready to pull.
  • A Feature (MMF) currently in progress state. Our other teams use swimlanes for this, but this is quite a small team so it will typically only have 1 MMF in progress at any one time.
  • White cards to denote packaged releases, containing release number and other release info, with the related cards pinned behind.
  • Avatars depicting who is working on what item.

kanban board 2