Let me share an observation that is simple to understand but hard to notice and correct while it is happening, simply because too many incentives are at stake. In fact, it happens so often in organizations that I can't help but write about it.
To explain it well, let me put across a scenario:
Imagine we're at a company called RocketApp. The team is building an app that will “revolutionize the way people communicate.”
At the top, the CXOs set ambitious targets: they want to launch the app in just six months. They tell their team of 50-odd people to ship fast and keep costs low, but they're vague about how exactly to do that.
“Bias for action” is a motto at this company, but there's no clear definition of what kind of risks are acceptable with shipping fast and breaking stuff — especially when it comes to possible glitches or security breaches in the app.
Now, let's talk about the coders, the designers, the project managers, i.e., folks on the proverbial front lines. They're the ones actually building this app and making the day-to-day decisions about how to use resources, manage costs, and mitigate risks, while the founders are off trying to secure yet another round of funding. They're the ones who have to take the somewhat fuzzy guidance coming from the top and turn it into a real, working product.
Fast forward six months.
The team has been working hard under a lot of pressure, trying to balance speed, quality, and cost. And then, a few weeks after the launch, disaster strikes. There's a major security breach, and users' personal data is leaked.
In the aftermath, the founders hold a town hall with the team. And what you see in the postmortem of the incident is people assigning blame with the benefit of hindsight:
“Oh, that coder made a mistake in the encryption algorithm,”
“The project manager should have caught this issue.”
“We should've gone with a better hosting service.”
or whatever other locus of blame crops up.
Now, what you, as a third-party observer, see is that the team is conjuring up a narrative for why the failure occurred, establishing cause-and-effect relationships post-facto, while what actually transpired during the building of the app was starkly different.
You saw that the coders and project managers were operating under intense pressure to meet ambitious production targets, with ambiguous instructions about balancing efficiency, cost, and risk. They were making the best decisions they could under the circumstances, but those circumstances were set by the organization itself!
So, it is easy to blame specific people or decisions after the fact; what we should really be looking at is the pressures and ambiguities created by the incongruent communication and incentive structures set by the people at the top, which may well have set those individuals up to fail.
Real problems are systemic in nature: there is no single rotten apple spoiling the lot.
If you're a manager, team lead, or CXO concerned about such a thing happening in your organization, here are a few tips:
Make sure your goals, expectations, and the means to reach them are clearly defined and communicated. In concrete terms, state the rules along with their caveats — where the team can safely break those rules in service of larger imperatives.
“While we want to be efficient and cost-effective, it's crucial that we don't compromise on quality or data security. If you're faced with a decision where you feel these areas may be compromised, I want to hear about it. Remember, our success isn't just hitting a launch date; it's delivering a safe and reliable product to our users.”
Ensure your team has the necessary resources — time, budget, personnel, equipment, and the right incentives — to accomplish their tasks. Don't just ask for efficiency, provide the means for it.
“If at any point you feel these resources are insufficient or could be used more effectively, let's discuss it.”
Clearly outline what risks are acceptable and which are not. This includes considering both low-consequence, high-frequency risks and high-consequence, low-frequency risks.
“If you think an integration lets us ship a feature in 1/10th the time but significantly increases our risk or frequently breaks the user experience, let's avoid it.”
Reinforce larger priorities during weekly meetings to encourage a culture where safety and security are prioritized, not just speed or efficiency. This can help prevent high-stakes errors from happening.
“How are we doing against our plan? Are there any risks or obstacles that have come up? Remember, our priority is quality and security. If meeting the deadline risks those, we need to reassess.”
If a team member points out a potential problem, prioritize addressing that problem to build confidence within the team that they're not left to fend for themselves in turbulent seas.
“I appreciate your bringing this to our attention. Let's figure out a good way to solve this. If it threatens our quality or security standards, we'll need to adjust our plan, even if that potentially impacts our timeline. Remember, it's better to flag a problem now than deal with the fallout later.”
At each stage, clear, open communication is key. As a manager, you should not only give clear instructions but also actively invite feedback and address concerns and misunderstandings promptly.
In complex systems, it is impossible to faithfully assign blame and establish causality in hindsight. Only good foresight and proactive risk management can surface the edge cases in leadership directives before they cause harm. It's easy to proclaim cookie-cutter motivational platitudes, but it is hard to integrate those directives into existing incentive structures without actively addressing conflicts as and when they come up.
More importantly, as a manager, you should resolve such conflicts as publicly as possible, so that everyone on the team knows it is good to voice concerns, especially when doing so could prevent larger mishaps down the line.
Otherwise, the entire organization is left playing hide-and-seek with the truth, where what's communicated and what's practiced are miles apart, and blame is assigned based on simplistic causal narratives created after the fact.