The Work You Don’t See
Why expertise often shows up as less, not more
The first time someone questioned one of our reports, it wasn’t because it was wrong. It was because it was short. They flicked through the pages, paused, and asked where the rest of it was. No appendix thick with tables, no page explaining every assumption. Just a recommendation, a banded saving range, and a short explanation of what would actually change their bill.
I remember thinking that this was either the clearest thing we’d ever produced, or a complete failure of signalling. Possibly both.
When people start learning something new, there’s often a noisy phase. You can hear it in the way they talk, the way they write, the way they answer questions that weren’t really questions. Volume becomes a proxy for understanding, not because it helps, but because it feels like proof. I went through that phase myself. I wanted to show the workings, to demonstrate that I’d seen every corner of the problem. Volume felt like safety, not just for the person reading, but for me as well.
With experience, that urge fades. Not because the work gets smaller, but because the shape of the problem becomes clearer. You start to notice which details actually change decisions and which ones don’t. At that point, showing everything stops feeling responsible. It starts to feel like avoiding a harder choice about what really matters next.
This isn’t summarising work for convenience. It’s distilling it responsibly. The work happens upstream, the judgement happens early, and what’s left is shaped for a customer decision rather than for completeness. That distinction matters more than it sounds, because it changes what omission means. You’re no longer hiding effort. You’re taking responsibility for relevance.
Our energy reports make this visible. To arrive at a single recommendation, we rebuild bills, simulate tariffs, model renewables, explore battery scenarios, and run sensitivity checks. The intermediate data is sprawling. Showing all of it to a customer wouldn’t empower them. It would confuse them, or worse, push them toward the wrong conclusion simply because something looked precise.
Once we work that way, the question stops being how accurate the model is and becomes how much the customer trusts it. Not whether the work was done, but how much of it they need to see in order to move forward with confidence.
We once back-solved a council-sponsored solar estimate to see how close it would come to our own modelling. For all its detail, the result landed almost on top of ours. Either their software was exceptional, or it was simply good enough. In that case, the distinction barely mattered. The recommendation would have been the same either way. In other situations, though, that difference is everything.
That’s why we’re cautious about putting precise long-term savings numbers in front of people. Year one is defensible. Beyond that, behaviour changes, tariffs move, and choices start to matter more than the model itself. Precision in those cases can create confidence without clarity.
So our reports are banded: somewhere between this figure and that one. The exact number matters less than the direction, and the size of the levers that actually move the outcome. What we're really trying to give customers is a sense of what's worth paying attention to, not a figure they'll anchor on and misinterpret.
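The mechanics of banding are simpler than they sound. Here is a minimal sketch, in Python, of the idea: run the same saving calculation under pessimistic and optimistic assumptions, then round the extremes outward to a deliberately coarse band. Every figure, function name, and scenario spread here is illustrative, not our actual model.

```python
# Hypothetical sketch of banded savings: evaluate one simple model
# under a few plausible scenarios, report a range rather than a point.

def annual_saving(usage_kwh: float, old_rate: float, new_rate: float) -> float:
    """Year-one saving from switching rates, in currency units."""
    return usage_kwh * (old_rate - new_rate)

def banded_saving(usage_kwh: float, old_rate: float, new_rate: float,
                  spread: float = 0.10, step: int = 50) -> tuple[int, int]:
    """Run pessimistic, central, and optimistic scenarios, then round
    the band outward to a step size customers won't anchor on."""
    scenarios = [
        # pessimistic: lower usage, new tariff drifts up
        annual_saving(usage_kwh * (1 - spread), old_rate, new_rate * (1 + spread)),
        # central estimate
        annual_saving(usage_kwh, old_rate, new_rate),
        # optimistic: higher usage, new tariff drifts down
        annual_saving(usage_kwh * (1 + spread), old_rate, new_rate * (1 - spread)),
    ]
    low, high = min(scenarios), max(scenarios)
    # floor the low end, ceil the high end, both to the nearest `step`
    return (int(low // step * step), int(-(-high // step) * step))

print(banded_saving(4000, 0.34, 0.27))  # prints (150, 450)
```

The point of the outward rounding is exactly the one above: a band of 150 to 450 communicates direction and magnitude without offering a false-precision figure to anchor on.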
Our customers move along the same learning curve. Early on, they want to see the detail. Over time, many decide whether to keep learning, or to outsource the thinking and trust us. That decision quietly changes the relationship. Once someone chooses to trust the recommendation, they’re no longer evaluating the work itself so much as the judgement behind it.
This is where the idea that “less is more” starts to break down. In low-trust settings, first engagements, or environments with high downside risk, distillation is often read as concealment rather than competence. In procurement-led or public-sector contexts, detail functions as institutional insurance. Volume redistributes risk. It makes decisions easier to defend, even when it doesn’t make them any better.
Precision reassures. It does not always inform. In many organisations, demands for detail are less about understanding the decision than about feeling safe making it.
Early confidence is noisy. With experience, expertise tends to distil rather than expand.
Which brings me back to that first report. The real question wasn’t whether it was light on detail or heavy with it. It was whether the person reading it believed the work behind it had been done. And that belief turns out to matter more than the pages themselves, because once you’re responsible for a decision, the hardest part isn’t getting more information. It’s knowing when you’ve seen enough.

