In many workplaces, communication focuses on what was done, not why it mattered.
Technical professionals often describe tasks, tools, or deliverables without articulating the outcome. As a result, the real value of their work is overlooked, misunderstood, or underleveraged. This limits alignment, recognition, and learning.
Whether in status updates, documentation, retrospectives, or evaluations, failing to lead with impact creates missed opportunities — for you and for your team.
The goal is to help individuals clearly communicate the significance of their work by tying it to measurable outcomes and organisational goals, not just in resumes but in all forms of professional communication: daily stand-ups, stakeholder updates, quarterly reviews, evaluation meetings, and internal newsletters.
The aim is to build credibility, improve alignment, and create feedback loops that reinforce intentional, outcome-driven problem solving.
This practice encourages you to frame your work in terms of measurable outcomes, not just tasks or deliverables. By using a simple communication structure, you can articulate the impact of your work in a way that is relevant and meaningful to your audience.
This practice uses a combination of:

Bottom Line Up Front (BLUF): a communication strategy where the main conclusion or most important point is stated at the beginning of the message. Common in military, executive, and high-stakes contexts where fast clarity is critical. BLUF ensures your main message (the result) is communicated first, not buried in explanation.

Quantified impact statements: these ensure that the message is meaningful and measurable.
Together, they form a communication pattern that works especially well in environments with limited attention spans or high decision pressure: status meetings, stakeholder briefings, evaluation sessions, and reports.
Use a simple communication structure to frame achievements:
“We achieved X, which mattered because of Y. We did this by Z.”

Where:

X is the measurable outcome (e.g. “We reduced the average response time by 30%”)
Y is the significance of the outcome (e.g. “which improved customer satisfaction scores”)
Z is the method used to achieve the outcome (e.g. “by implementing a new ticketing system”)
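For teams that assemble status updates programmatically (release notes, changelog digests, and the like), the structure above can be sketched as a trivial string template. This is a hypothetical helper to illustrate the pattern, not part of the practice itself; the function name and wording are assumptions.

```python
def impact_statement(outcome: str, significance: str, method: str) -> str:
    """Render the "We achieved X, which mattered because of Y.
    We did this by Z." pattern as a single sentence."""
    return (
        f"We achieved {outcome}, "
        f"which mattered because of {significance}. "
        f"We did this by {method}."
    )

# Example usage with illustrative values:
print(impact_statement(
    "a 30% reduction in average response time",
    "improved customer satisfaction scores",
    "implementing a new ticketing system",
))
```

The point of the sketch is that X, Y, and Z are all mandatory arguments: leaving any of them out should feel as wrong in a status update as it would in a function call.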
Not every contribution ends in a clean success. That doesn’t mean it lacks value. This technique also supports reporting failures, obstacles, and partial successes in a way that still communicates clarity, intent, and alignment. With slight adjustments to the formula, you can still frame outcomes meaningfully — even when things don’t go as planned. We will explore a few common variations in the sections below.
Use the same structure, but shift the emphasis from success to learning or avoided cost.
“We aimed for X to achieve Y. It didn’t produce the expected outcome, but it revealed Z. This helped us eliminate a flawed path and redirect effort towards Q.”
Example: “We aimed for a new caching layer to achieve faster page loads. It didn’t produce the expected speed-up, but it revealed that the real bottleneck was in database queries. This helped us eliminate a flawed path and redirect effort towards query optimisation.”
Sometimes the value of your work is in what you uncovered or unblocked, not what you completed.
“While working on X, we discovered constraint Y. This helped clarify our roadmap and sparked related initiative Z to address it.”
Example: “While working on the checkout redesign, we discovered that the payment provider’s rate limits constrain peak-hour traffic. This helped clarify our roadmap and sparked a related initiative to introduce request queuing.”
When you make significant progress but haven’t reached the full outcome, you can still frame it clearly, and define what should happen next.
“Working towards goal X, we did Y. It didn’t fully land yet, but it moved the needle by Z. As a next step, we can Q.”
Example: “Working towards full regression-test automation, we migrated half of the manual test suite. It didn’t fully land yet, but it moved the needle by cutting manual QA time by roughly a third. As a next step, we can migrate the remaining suites.”
The following factors support effective application of the practice:
Objectives and Key Results (OKRs): a goal-setting framework used to align individual and team efforts with broader organisational priorities. Each Objective defines a qualitative direction or ambition, while its associated Key Results describe specific, measurable outcomes used to track progress. OKRs encourage transparency, focus, and outcome-driven planning across all levels of an organisation. (See: OKRs)

The following factors prevent effective application of the practice:
This mindset supports better engineering discipline by encouraging you to think about the impact of your work, not just the technical details. The more you communicate your work in terms of real-world effects, the more effective and visible your contributions become.
While the approach brings numerous benefits, it can also lead to several unexpected or undesired outcomes:
Metric distortion and unintended incentives: When outcome framing becomes standard practice, teams may begin to optimise for the appearance of success rather than meaningful progress. This is an instance of Goodhart’s Law: when a measure becomes a target, it loses its usefulness. In extreme cases, this leads to Cobra Effect behaviours — where gaming or metric-chasing overshadows real value.
Confirmation bias in self-reporting: When individuals are expected to frame work as impact, they may (consciously or not) select only favourable data, ignore context, or over-attribute success to their actions. Without feedback loops, this turns impact reporting into a reinforcement loop that hides flaws and misses learning opportunities.
Perceived competitiveness or self-promotion: When not paired with humility or transparency, outcome framing may make colleagues feel threatened or overshadowed — especially in highly collaborative environments.
Short-termism: A strong focus on immediately measurable results may lead individuals to favour fast wins over foundational or long-term investments, skewing priorities.
Misrepresentation and visibility over substance: metrics can be cherry-picked, decontextualised, or exaggerated, especially in environments that reward appearances more than insight. This pressure can inflate results and contribute to vanity metrics or short-term optimisations.
To mitigate the potential negative consequences of the approach:
Shared framing over self-promotion: Use this structure to foster collective understanding of progress, not personal branding. Frame your impact in the context of team or organisational goals.
Clear attribution and acknowledgement: Be transparent about who contributed and how. Use “we” where appropriate, and avoid overstating individual ownership of shared efforts.
Balanced storytelling: Pair quantitative outcomes with qualitative context — explain what changed, for whom, and why it matters, not just how much.
Indicator integrity: Choose metrics that reflect meaningful outcomes, not vanity or convenience. Prioritise indicators tied to customer, user, or team benefit.
Support for imperfect outcomes: Encourage the use of adapted formats for partial wins, failed experiments, or invisible work (see: Variants). Value the learning, not just the outcome.
Avoid metric inflation: Don’t reward noise. Make it culturally acceptable to report neutral or inconclusive results without penalty — especially in complex or exploratory work.
Build feedback into reporting: Encourage short cycles of review and refinement — not just “what was achieved,” but “what did we learn, and what’s next?”
tip: The goal isn’t to win points. It’s to illuminate the value of thoughtful, goal-aligned work.
| Default phrase | Impact-oriented alternative |
|---|---|
| “I wrote tests.” | We improved code coverage from 60% to 85%, which reduced the number of bugs in production by 30%. |
| “Built a dashboard.” | We enabled data-driven decision making for 4 product teams monitoring live KPIs. We did this by building a shared dashboard. |
| “We finished the migration.” | We achieved 80% fewer deployment failures, which mattered because it stabilised our production releases. We did this by migrating the system. |
| “I worked on the pipeline.” | We reduced average build time by 61%, which improved developer feedback loops. |
| “I fixed a bug.” | We reduced the number of support tickets by 20%, which improved customer satisfaction. We achieved this by fixing a bug in the payment processing system. |
| “I helped improve performance.” | Page load time improved by 37%, which boosted checkout completion by 4.6%. We achieved this by minimising the front-end bundles and implementing a Redis cache. |
| “Cleaned up old scripts.” | We reduced repo size by 15% and improved onboarding clarity, which mattered because new devs onboard faster. |
| “I wrote a blog post.” | We improved team visibility and shared internal knowledge, which mattered because it clarified a common problem. We did this by publishing a blog post that received 200 views and 15 comments. |
Even when numbers are rough, directional improvements help others understand impact. Use approximations (“~20% faster”, “halved retries”, “saved hours per week”) when precision isn’t available.
You’re assuming everything can be turned into numbers. But creative work, long-term foundational improvements, mentorship, or code quality — these aren’t metrics-friendly. If I try to quantify everything, I’ll either waste time inventing fake numbers or distort what matters most.
Not all work is quantifiable at the surface, but most contributes to broader, measurable outcomes. Mentorship affects onboarding time. Code quality affects delivery velocity. Communication reduces ambiguity and failure cycles.
Quantification isn’t about precision or vanity metrics. It’s about showing intent and effect, even through approximations.
tip: It’s not about fabricating numbers. It’s about connecting effort to effect.
Visibility doesn’t always lead to recognition. I’ve seen people play this game: inflate their numbers, frame shared work as personal wins, and get rewarded. Meanwhile, quieter team players who keep things stable are overlooked — even if they deliver real value.
That’s a fair point. The goal here isn’t to play the game, but to make your work visible in a way that is meaningful to you and your team. Communication won’t solve systemic bias. But poor communication guarantees invisibility.
The point is not to game the system, but to increase the clarity and traceability of your contribution. Even in imperfect systems, a well-framed outcome improves the odds of understanding and advocacy.
tip: You can’t control the system, but you can control what people are able to see.
This smells like OKR-speak and resume padding. I get that we should show impact, but turning every change into a mini case study just adds overhead. Most people just want to do good work — not write PR blurbs about it.
That’s fair. The goal here isn’t to turn engineers into marketers. The technique is designed to favour clarity over polish.
Framing impact doesn’t mean performing. It means reducing ambiguity and making sure that the value of your work is legible to others.
tip: Don’t spin. Just say what changed — how much, and why it mattered.