B.A. Geo., MBA Culture,
MSc Process Safety,
MBA Sustainable Energy
Data Scientist and Sustainable
Energy Professional
brendon.james@outlook.com
We do not fail because we lack information. We fail because we do not act on what we know—especially when acting is uncomfortable.
Humans tend to avoid conflict, and we work within systems where those bringing bad news are punished and those challenging the status quo are labelled disagreeable. This is part of our challenge as leaders: to shift the discussion from "go along to get along" to "how can we make this better?" Constructive disagreement, grounded in facts and free from personality, must become the norm.
Achieving this requires deliberate focus on psychosocial safety: environments where individuals feel secure speaking, questioning and contributing to decision-making without fear, where thinking is shared and trust is high.
This is easier said than done. However, neglecting leadership in this area can have devastating impacts. A widely examined example is the Boeing 737 Max crisis.
Two tragic accidents—Lion Air Flight 610 in Indonesia (2018) and Ethiopian Airlines Flight 302 (2019)—claimed 346 lives. Investigations revealed that a flight control system known as MCAS played a critical role. More importantly, internal communications showed that concerns about the system had been raised during development. The information existed but did not meaningfully influence decision-making.
During US Senate hearings, Boeing’s leadership was questioned: why weren’t these concerns escalated? What emerged was not simply a technical failure but a deep organisational issue, one where signals were present but not urgently acted upon. This was not just an engineering problem; it was a decision-making problem within a complex system. That is where the real risk resides.
Closer to home, on August 2nd, 2010, Trinidad and Tobago experienced one of its most significant flooding events in recent history. Many communities were overwhelmed as water levels rose rapidly, impacting homes, infrastructure and livelihoods. Petrotrin’s refinery was shut down because certain areas were under 14 feet of water. The rainfall was intense, but the vulnerability was known.
Flood-prone areas had been identified. Drainage challenges had been discussed. Signals existed across reports, observations and lived experience. Yet, as with many complex events, the issue was not the absence of information; it was integrating those signals into decisions, planning and action. In complex systems, risk rarely appears suddenly.
It accumulates, building through small, often overlooked signals: unchallenged assumptions, conversations not fully explored and uncomfortable decisions delayed.
The timely theme for World Day for Safety and Health at Work emphasises the importance of healthy psychosocial working environments, particularly as digital transformation reshapes workplaces.
Today, organisations operate in environments defined by data, speed and complexity. We have more information than ever before—real-time dashboards, predictive analytics and digital monitoring systems. Yet, the presence of information does not automatically translate into better decisions.
Sometimes, in fact, it creates the illusion of control. The real question is not whether data exists, but whether people feel empowered to act on what the data is telling them. Psychosocial safety plays a critical role here. When individuals hesitate to speak, when concerns are filtered to align with expectations, or when difficult truths are softened to maintain harmony, organisations cannot respond effectively to risk.
Silence, in this context, is not passive. It is active: suppressing early warning signals, reinforcing flawed assumptions and, over time, allowing risk to mature into failure. Active silence becomes even more consequential when considering emerging challenges such as climate change and the energy transition. We are not short of data. Climate projections, flood models and infrastructure assessments are increasingly available. We broadly understand the risks we face, from extreme weather events to system stresses across utilities and supply chains.
Yet, in many decision-making environments, these risks remain acknowledged but not fully acted upon.
Why?
Because acting on them requires difficult conversations. It requires challenging political priorities, rethinking investments, and making decisions essential for long-term resilience rather than immediate benefits.
And again, we return to the same point: Do people feel safe enough to speak? Do systems allow those signals to shape decisions?
If we are serious about building safer workplaces and societies, we must go beyond compliance and procedures, instead focusing on decision integrity: building environments where:
• Concerns are raised early, without fear of reprisal
• Data reveals patterns instead of just reporting outcomes
• Leaders invite challenge and dissent
• Decisions are examined for both their results and how they were made
Leadership, therefore, is not about having all the answers, but about creating conditions where the right questions can be asked—and heard. The flood on August 2nd was not just a natural event. The Boeing 737 Max crisis was not just a technical failure. Both serve as reminders of something deeper.
In complex systems, the most dangerous failures are not technical. They are failures of decision-making.
And the difference between resilience and catastrophe often comes down to a single moment: whether someone was willing to speak—and whether the system was ready to listen before silence became the risk.
The foregoing is a weekly column by EarthMedic and EarthNurse NGO to help readers understand and address the climate and health crisis.
