How do you measure the performance of an organisation delivering software, starting from product design right through to operations?
## Considerations
The following factors should be considered when determining how to measure the performance of software delivery within an organisation:
1. Focus on **global outcomes**. Don't incentivise one team in a way that pits it against another (e.g. developers incentivised to ship fast throwing changes over the wall to an Ops team incentivised to stay slow and stable).
2. Focus on **outcomes, not outputs**. Don't reward busywork that doesn't help the organisation reach its goals.
## Criteria for measurement
[[Accelerate (Book)]] defines four key measurements:
- **Delivery Lead Time**. The time it takes from a customer making a request to the request being satisfied. Two constituent parts:
1. **Design Lead Time** — Time to design and validate a product feature
	2. [[Product Delivery Lead Time]] — Time to deliver the feature to customers. This second part is easier to measure because it has less variability.
- **[[Deployment Frequency]]**
- **Time to Restore Service** — [[Operational metrics for a distributed application#MTTR Mean Time To Recovery]]
- **[[Change Fail Rate]]**
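The four metrics above can be sketched as simple computations over deployment and incident records. This is a minimal illustration, not anything prescribed by [[Accelerate (Book)]]: the data shapes (`deployments` as commit/deploy timestamp pairs with a failure flag, `incidents` as start/restore pairs) are hypothetical assumptions, and real pipelines would pull these from CI/CD and incident-tracking systems.

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical records: (deployed_at, commit_at, caused_failure)
deployments = [
    (datetime(2024, 1, 1, 10), datetime(2023, 12, 31, 9), False),
    (datetime(2024, 1, 2, 15), datetime(2024, 1, 2, 8), True),
    (datetime(2024, 1, 3, 11), datetime(2024, 1, 2, 16), False),
]
# Hypothetical incidents: (started_at, restored_at)
incidents = [(datetime(2024, 1, 2, 15), datetime(2024, 1, 2, 17))]

# Product Delivery Lead Time: median commit-to-deploy duration
delivery_lead_time = median(dep - commit for dep, commit, _ in deployments)

# Deployment Frequency: deployments per day over the observed window
deploy_times = [dep for dep, _, _ in deployments]
window_days = (max(deploy_times) - min(deploy_times)).days or 1
deployment_frequency = len(deployments) / window_days

# Change Fail Rate: fraction of deployments that caused a failure
change_fail_rate = sum(failed for _, _, failed in deployments) / len(deployments)

# Time to Restore Service: median incident duration
time_to_restore = median(restored - started for started, restored in incidents)
```

Even a rough version of this, run over a few months of history, would show which of the four metrics is the weakest and therefore where to focus improvement.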
## My #OpenQuestions and observations
- How many orgs actively measure any of these?
- How accurately would these need to be measured for an org to get value from them, i.e. accurately enough to know where to prioritise improvement efforts?
---
## References
- [[Accelerate (Book)]]
---
tags: [[Devops]]