"Do Lots" - How Much/Many?
Measure how much raw work product is flowing, ideally all the way to customers rather than just through development. This measure isn't about customer value; it is evidence that the system is moving items (that it has flow). When balanced with the other five dimensions, it is also useful for forecasting future delivery.
Too little focus or capability
- Dissatisfied customers or stakeholders not getting what they need.
- Demand > supply, but you don't know it.
Too much focus
- Declining quality causing defects and re-dos (not "really" done yet)
- Less valuable "easy" features delivered rather than most needed
Typical metrics in this category
- Throughput - count of items or tickets per day/week/sprint
- Velocity - sum story points per sprint
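As a minimal sketch of the throughput metric above, counting completed items per week from their completion dates (the dates here are illustrative, not from the source):

```python
from collections import Counter
from datetime import date

# Hypothetical completion dates for finished work items.
completed = [
    date(2024, 3, 4), date(2024, 3, 5), date(2024, 3, 7),
    date(2024, 3, 11), date(2024, 3, 12), date(2024, 3, 14),
    date(2024, 3, 14),
]

# Throughput: count of items completed per ISO week.
throughput = Counter(d.isocalendar()[1] for d in completed)
for week, count in sorted(throughput.items()):
    print(f"week {week}: {count} items")
```

The same counter works per day or per sprint by swapping the grouping key.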
"Do it Fast" - How fast?
Respond and deliver things quickly, given their complexity and novelty. The easiest way to improve this measure is to finish something already in progress before starting something else.
Too little focus or capability
- Customers frustrated in how long it takes to get changes
Too much focus
- Declining quality causing defects and re-dos (not "really" done yet)
- Less valuable "easy" features delivered rather than most needed
Typical metrics in this category
- Time in State - the time an item spent within a "state," for example, "In Development."
- Cycle time - the time from start to finish at some boundaries in your system
- Lead time - the time from some commitment to delivery (to the person committed to)
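Under the definitions above, cycle time and lead time reduce to timestamp differences; a minimal sketch with hypothetical timestamps for a single item:

```python
from datetime import datetime

# Hypothetical timestamps for one work item.
committed = datetime(2024, 3, 1, 9, 0)   # commitment made to the customer
started = datetime(2024, 3, 4, 10, 0)    # entered "In Development"
finished = datetime(2024, 3, 8, 16, 0)   # delivered

cycle_time = finished - started    # start to finish at your chosen boundaries
lead_time = finished - committed   # commitment to delivery

print(f"cycle time: {cycle_time.days} days")
print(f"lead time:  {lead_time.days} days")
```

Where you place the start and finish boundaries changes the numbers, so state the boundaries whenever you report these metrics.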
"Do it Predictably" - How consistent is the delivery of value?
Delivery occurs at a consistent pace rather than in huge feasts or famines of value delivered to customers; in other words, the variance of the "Do Lots" pace. This dimension surfaces shorter-term process instability (the sustainability metric, coming up soon, measures longer-term system stability).
Too little focus or capability
- Periods of strong progress followed by periods of lower value delivered to customers.
Too much focus
- Less risky "known" features delivered rather than most valuable or needed
- Little incentive to push process improvements in case they cause a temporary decline
Typical metrics in this category
- Variability of throughput or velocity
- Variability of the delivered customer value
- Net Process Flow: Things Delivered - Things Started. This measure shows balance through the system, with variability represented as higher or lower peaks; the desired state hovers around zero.
Tip: For a variability measure, consider using the coefficient of variation (Standard Deviation / Mean) rather than the Standard Deviation alone. Higher values naturally have a higher Standard Deviation for the same percentage change; dividing by the mean normalizes that.
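The tip above and the net-process-flow metric can be sketched together; the weekly counts here are illustrative:

```python
from statistics import mean, stdev

# Hypothetical items started and delivered per week.
delivered_per_week = [8, 5, 9, 4, 7, 6]
started_per_week = [7, 6, 8, 5, 8, 6]

# Coefficient of variation: spread normalized by the mean, so series
# with different throughput scales can be compared fairly.
cov = stdev(delivered_per_week) / mean(delivered_per_week)
print(f"coefficient of variation: {cov:.2f}")

# Net process flow per week: delivered minus started; values hovering
# around zero indicate a balanced system.
net_flow = [d - s for d, s in zip(delivered_per_week, started_per_week)]
print(f"net process flow: {net_flow}")
```

A net flow that stays persistently negative signals that work is being started faster than it is delivered.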
"Do it Well" - How good was the quality versus expectations?
A measure of how well the things delivered solve a problem or need. This measure is often called Quality and is one of the hardest to get a handle on. The goal isn't purely quality; it serves as an early warning sign that a system is being pushed to deliver beyond its capability.
Too little focus or capability
- Rework. What is delivered needs to be corrected
- Customer dissatisfaction.
- Production issues.
Too much focus
- Little or no delivery of value or flow of items due to "just a little more testing."
- Slow feedback if the wrong thing is built (albeit perfectly functioning)
Typical metrics in this category
- Escaped defects. Defects found outside of the development and delivery team
- Customer satisfaction. Customers don't like what you built and tell you
- Production rollbacks. Second and third releases to get a stable, working system
- Unplanned downtime. Issues in production outside of planned change windows
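One common way to track the escaped-defects metric above is as a rate rather than a raw count; a minimal sketch with hypothetical numbers:

```python
# Hypothetical defect counts for one release period.
found_internally = 42  # caught by the development and delivery team
escaped = 6            # found by customers or in production

# Escaped defect rate: share of all known defects that got past the team.
escaped_rate = escaped / (escaped + found_internally)
print(f"escaped defect rate: {escaped_rate:.1%}")
```

A rising rate over several releases suggests the system is being pushed beyond its capability, even if absolute counts stay small.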
"Do Valuable Stuff" - How valuable was it to the customer?
A measure of how much value customers derive from released features or projects. The goal isn't purely customer value; it serves as an early warning sign that a system is being pushed to focus on work output rather than an outcome.
Too little focus or capability
- Rework. What is delivered needs to be revisited to deliver "more" of this feature
- Customer dissatisfaction. Internal feeling that work is flowing well, but the customers aren't feeling the value.
Too much focus
- Increasing technical debt. Teams consistently skip technical debt reduction items for supposedly higher value items.
- Lack of prioritization for strategic work that is mid to longer-term (current customers happy, but declining entry into new markets or targets).
Typical metrics in this category
- Cost of delay. An economic view of the cost of NOT doing work to the customer and organization.
- Alignment to strategy. Prioritized work allocation matches a planned strategic allocation
- Customer satisfaction. Customer feedback confirms what was delivered solved a problem with high satisfaction.
"Keep Doing It" - How sustainable is the delivery system (and people)?
A measure of how likely it is that the current performance of the development and delivery system can continue into the future. Often called the "happiness" metric, but it's more important than that label suggests. When teams push hard to improve the other metrics, it sometimes takes a toll that causes a decline later. The goal of this metric is to be an early warning indicator of that gloomy future performance.
Too little focus or capability
- The current performance measures aren't maintained.
- The collapse of delivery.
Too much focus
- Stagnant performance improvement over time. The other metrics stay flat.
Typical metrics in this category
- Team health via survey or team retrospective (an honest answer to "are we able to continue at this pace?")
- The aggregate of the other performance metrics (D1 to D5)