Do your estimates suck? You know they probably do, even if you can’t prove it. We all know that estimates are just white lies we use to keep management happy. But do you know how far off they really are?
Bitter Truth
I’ve long suspected that the time my team spends on estimates is wasted, but I’ve never had the numbers to back it up. Fed up after another round of miles-off estimates nearly cost us a delivery date, I calculated the correlation coefficient between our estimates and the actual lead times of our user stories.
With the wealth of information stashed away in Jira, it proved to be straightforward. I came up with a Pearson’s r of 0.28 (to no one’s surprise). Given that the scale runs from -1 to 1, with 0 meaning no relationship at all, that’s a very weak correlation. Just check out this scatterplot:
The upshot is that I can now prove we need to increase our estimates across the board. As Uncle Bob said (fast forward to 17:00), “everything in software takes 3 times longer than you think it should, even when you know this and take it into account.” It’ll be a hard sell to the team (optimism is an occupational hazard for programmers, after all), but the numbers don’t lie.
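If you want to run the same check on your own backlog, the calculation is nothing exotic. Here’s a minimal sketch; the file name and column names (stories.csv, estimate_days, actual_days) are placeholders, so swap in whatever your Jira export actually produces:

```python
# Rough sketch: Pearson's r between story estimates and actual lead times.
# Assumes a Jira CSV export with hypothetical "estimate_days" and
# "actual_days" columns -- adjust to match your own export.
import csv
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed the textbook way."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

estimates, actuals = [], []
with open("stories.csv", newline="") as f:
    for row in csv.DictReader(f):
        estimates.append(float(row["estimate_days"]))
        actuals.append(float(row["actual_days"]))

print(f"estimate vs. lead time: r = {pearson_r(estimates, actuals):.2f}")
```

A value near 1 means bigger estimates reliably meant longer lead times; a value near 0 means your estimates carry essentially no signal.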
A New Hope
I’m sure we’re not the only ones in this boat, burning hours and brain cells on estimates that don’t inform. So I’ve turned this into a free and easy-to-use tool: have at it, Jira users.
I challenge everyone to take me up on my offer and see how much information your estimates are really giving you. Check yourself after every sprint for 3 months and use the observed correlation to get better at estimating.
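If you’d rather script that sprint-by-sprint check than click through a tool, here’s one way it might look, again assuming the hypothetical stories.csv export, now with a sprint column, and Python 3.10+ for statistics.correlation:

```python
# Recompute the correlation after each sprint to see whether the estimates
# are becoming more informative over time. Same hypothetical CSV as before,
# plus a "sprint" column; statistics.correlation (Python 3.10+) returns
# Pearson's r directly.
import csv
import statistics
from collections import defaultdict

by_sprint = defaultdict(lambda: ([], []))
with open("stories.csv", newline="") as f:
    for row in csv.DictReader(f):
        est, act = by_sprint[row["sprint"]]
        est.append(float(row["estimate_days"]))
        act.append(float(row["actual_days"]))

cumulative_est, cumulative_act = [], []
for sprint in sorted(by_sprint):
    est, act = by_sprint[sprint]
    cumulative_est += est
    cumulative_act += act
    r = statistics.correlation(cumulative_est, cumulative_act)
    print(f"after {sprint}: r = {r:.2f} over {len(cumulative_est)} stories")
```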