Why Stuff Fails (“The Thermocline of Truth”)
- Phil Venables
- Apr 19
- 5 min read
For many years I’ve observed the same pattern of failure in projects, programs, issue mitigation and indeed anything that requires more than a trivial amount of cross-organization coordination. The pattern goes something like this: a big project starts, and the status reporting is green (good) across the board. The reporting continues to show green until, at some point, it doesn’t - indeed it often goes bright screaming red instantaneously - and then it becomes clear it had all been going wrong for a long time.
I had long wondered what to call this pattern of Green, Green, Green, Green, RED until I read this article about the “thermocline of truth”.

A thermocline is a temperature barrier between a surface layer of warmer water and the colder, deeper water beneath it. The analogy is that many organizations have a thermocline of truth: a barrier that stops accurate information flowing from those who know the reality on the ground to the management or executive layers above, who are shielded, implicitly or explicitly, from the truth of particular situations.
Usually this means those closer to issues and projects know the real, sometimes problematic, situation while those higher up have a rosier picture. Clearly, this can be a major problem. There are a few causes, ranging from fear of reporting bad news, lower-level managers wanting to look good, and imminent promotions hinging on things going well, to good old-fashioned optimism that things will work out or get back on track.
There are many other similar patterns that smack of “thermocline”-related causes:
Silent unchallenged approvals. Some excessively risky or half-baked project or product tacitly ends up going live because no one had the courage to raise a red flag and stop or pause it.
Risk acceptances. A corollary of this is the insufficiently escalated risk acceptance. This typically manifests as a relatively junior product or program owner, singularly motivated to hit a deadline, accepting the risk rather than the more senior leader who will actually be accountable for the negative consequences if the risk is realized.
Sunk costs. The classic sunk cost fallacy: so much has already been invested in a particular project or program that it must continue, because nobody has the guts or the authority to just put the thing out of its misery and start over.
Decision re-litigation down the ranks. An ostensibly correct decision is made at a senior level that doesn’t have total consensus but is still necessary to move forward. An example could be a choice between two competing approaches to deliver a project: only one can be selected, but it’s important the whole organization gets with the program. While everyone shows commitment on day 1, the team whose approach was not selected passive-aggressively doesn’t give it their all, or, worse, constantly relitigates the decision, and this never gets escalated back up the chain.
Some organizations have cultures that push back on or avoid these types of situations. Many don’t. And even the ones that do got there by taking specific actions to drive that culture. So what are the actions that help bust through these barriers and get all levels engaged and focused on reality:
Canary milestones. To counter the Green, Green, Green, … RED pathology it’s important to have interim milestones - perhaps artificially constructed warning milestones (canaries) - that will fail well before a critical project failure is at risk. This makes it much harder to miss things going wrong until it’s too late.
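To make that concrete, here is a minimal sketch (the milestone names, dates, and thresholds are purely illustrative, not a prescribed tool) of how canary milestones surface a red status months before the launch date would:

```python
# Illustrative sketch only: a project plan with interim "canary" milestones
# that are deliberately scheduled well before launch, so a slip turns the
# status red early rather than on the eve of delivery.
from dataclasses import dataclass
from datetime import date

@dataclass
class Milestone:
    name: str
    due: date
    done: bool = False
    canary: bool = False  # an early-warning milestone constructed to fail first

def project_status(milestones, today):
    """Red as soon as any milestone is overdue and not done - canaries included."""
    overdue = [m.name for m in milestones if not m.done and today > m.due]
    return ("RED", overdue) if overdue else ("GREEN", [])

# Hypothetical plan: the canary sits six months ahead of the launch milestone.
plan = [
    Milestone("integration environment live", date(2024, 3, 1), canary=True),
    Milestone("launch", date(2024, 9, 1)),
]
print(project_status(plan, date(2024, 3, 15)))  # ('RED', ['integration environment live'])
```

The point isn’t the tooling; it’s that the canaries are scheduled early enough that a slip cannot hide behind an on-track final deadline.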
Executive pull questions. Executives and other leaders - in the relevant reviews, team discussions, program and product meetings, or just on some other regular cadence - can establish a fixed protocol of questions to pull the truth from the environment, for example: “What has to continue to go right for us to keep to the schedule?” or “Let’s go around the room; everyone names the one thing they are most concerned about.”
“Go Flight”. The best one I’ve seen, building on this, in a few organizations is a “Go Flight” process modeled on the NASA Mission Control process. That is, in the lead-up to a launch an executive goes around the room (or virtual room) and requires explicit go approvals from people. A similar construct is, thankfully, quite common in design and launch documents in many organizations, where explicit approval is required from a selection of leaders and technical domain-specific authorities.
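Stripped to its essentials, the protocol is simple: every required role records an explicit “go”, and silence counts as a hold. A rough sketch (the role names here are assumptions, not a prescribed list):

```python
# Illustrative sketch only: a "Go Flight"-style launch poll. Each required role
# must record an explicit "go"; a missing vote or a "no-go" holds the launch.
REQUIRED_ROLES = ["engineering", "security", "legal", "operations", "product"]  # assumed roles

def poll_for_launch(votes):
    """votes maps role -> 'go' or 'no-go'. Anything short of an explicit 'go' blocks."""
    for role in REQUIRED_ROLES:
        vote = votes.get(role)
        if vote != "go":
            print(f"HOLD: {role} reports {vote or 'nothing'} - launch does not proceed")
            return False
    print("All stations report go - launch proceeds")
    return True

poll_for_launch({"engineering": "go", "security": "no-go", "legal": "go"})
```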
Rule-driven risk acceptance and re-triggers. To counter the tendency for risk acceptance to be done at too low a level in the organization, establish a formal rule set that requires different levels of risk to be reviewed and endorsed at specific levels of the organization. You know you’ve got this right when your rule set means some potential risks need CEO sign-off. It’s also important to manage this collection of risk acceptances as part of a risk inventory or risk register (more on that here). Revisit previously accepted, possibly long-term, risks when the executive who accepted them changes role. If a risk is realized you don’t want the current risk owner to be able to legitimately say, “I never personally accepted that risk.”
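A minimal sketch of what such a rule set might look like in code - the severity tiers, organizational levels, and field names are all illustrative assumptions - including the re-trigger when the accepting executive changes role:

```python
# Illustrative sketch only: map each risk severity to the minimum level allowed
# to accept it, and invalidate an acceptance when the accepting executive moves on.
from dataclasses import dataclass

MIN_ACCEPTANCE_LEVEL = {          # assumed severity tiers and org levels
    "low": "team_lead",
    "medium": "director",
    "high": "vp",
    "critical": "ceo",            # some risks really should need CEO sign-off
}
LEVEL_RANK = {"team_lead": 1, "director": 2, "vp": 3, "ceo": 4}

@dataclass
class RiskAcceptance:
    risk_id: str
    severity: str
    accepted_by: str                # the named executive, not just a role
    accepter_level: str
    accepter_in_role: bool = True   # flip to False on role change -> forces re-review

def acceptance_is_valid(ra):
    """Valid only if accepted at a senior-enough level by someone still in role."""
    senior_enough = LEVEL_RANK[ra.accepter_level] >= LEVEL_RANK[MIN_ACCEPTANCE_LEVEL[ra.severity]]
    return senior_enough and ra.accepter_in_role

ra = RiskAcceptance("RISK-042", "high", "a.director", "director")
print(acceptance_is_valid(ra))  # False: under this rule set, 'high' needs VP or above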
Run toward problems. Some organizations are culturally inclined to cultivate leaders at all levels who “run toward problems”. I worked at one organization that was supremely good at this. In fact, leaders were sometimes almost competitive in highlighting and fixing issues, even beyond their regular span of control. This was a natural outgrowth of its heritage as a partnership, where there is an intrinsic shared accountability among leaders. But it also helped that in promotion processes and other reward decisions people were explicitly graded on whether they did this and whether they “reached for accountability”.
Escalation as a service. People are naturally nervous about escalating risks or issues. This is usually because they have cultivated close working relationships with teams, and escalating feels like something done to that team - a bypassing of them, or otherwise trying to get them into trouble. I’ve found that if you keep framing escalation as a “service” to leaders and teams then it helps people get over this, i.e. “I’m going to partner with you to frame this risk or issue so we can go up your management chain together and get some attention on this important issue.” Now, as a leader who might be in receipt of escalations, it’s massively important not to shoot messengers. Most thermoclines of truth are created by leaders who exhibit negative behaviors in the face of bad news.
Bottom line: it’s vital for security, and risk management overall, that leaders ensure they are getting accurate and timely information about projects, issues, and risks, and that they create a culture - through deliberate action - that can bust through the thermocline of truth. It’s also crucial that leaders, through their own attitudes and behaviors (shooting messengers, for example), don’t create that barrier themselves.