
At a recent conference, I witnessed something that’s become far too common in data leadership circles: genuine surprise that chief data officers consistently cite culture — not technology — as their greatest challenge. Despite a decade of research and experience pointing to the same root cause, conversations still tend to focus on tools rather than the need for culture change.
It reflects a broader tendency in the field: When complexity increases, we tend to look for ways to automate the human element rather than hold individuals accountable. When governance becomes challenging, we question its value rather than its framework. However, governance, approached thoughtfully, isn’t the problem — it’s the only path to sustainable and ethical progress.
Tools alone cannot sort through context, diagnose organizational ambiguity, or make appropriate policy recommendations when it comes to access decisions, risk thresholds, or moral dilemmas around data collection, storage, sharing, or record destruction. Should our healthcare data be shared with AI? Maybe — but how? Under what circumstances? In what environments?
Governance is not just a set of workflows. It’s not a checkbox or a duct-taped policy tacked onto the end of implementation. Real governance is how human beings stay in the loop on decisions that affect them. It’s how we remember who the system is for — and who it may be forgetting. Governance is the last mile of business strategy. It’s the mechanism by which we ask and answer the question, “Can we? Should we?”
No, governance is not easy. But let’s be clear: Ethics and governance are not things we can code our way out of.
Decades of research show why this matters. Karen Stenner demonstrated that under threat, people don’t open up to possibility; they close down, preferring structure and certainty. Jonathan Haidt added that we reason backward from emotion; we tell ourselves stories that soothe. In his 1941 book “Escape from Freedom,” Erich Fromm saw it coming: In times of fear, people don’t flee from authority — they flee to it. We trade freedom for control. Eric Hoffer, in “The True Believer,” added that mass movements don’t thrive on facts. They thrive on narrative clarity. Truth is optional if the story feels total.
What’s important to remember is that human tendencies are not destinies. These are the default settings when people are scared, fragmented, or exhausted. AI, by automating more of our cognition, only sharpens those defaults. So instead of suggesting, “Governance is dead” or, “Let’s cut governance because it’s hard and everyone is bad at it,” we should be asking something harder, more nuanced, and more informed: “If humans tend toward emotional shortcuts, deference to authority, and a craving for narrative closure in times of uncertainty, where is the hope that we’ll do the emotional labor required to slow down and perform the ethical checks AI demands?”
The answer can’t be techno-optimism. It has to be design.
We’re deploying AI into systems already operating under high pressure, high speed, and high uncertainty. If we don’t actively design for conscience, we will default to obedience, certainty, and deference to the machine. That’s not innovation. That’s regression, wrapped in denial.
Hope lives in the structures that protect reflection under pressure. That means governance frameworks, leadership cultures, educational systems, and norms that anticipate our tendency to flee and build scaffolding to counter it. Pause protocols. Discernment spaces. Disruption rituals. Accountability loops. Language that reminds people what’s at stake when they don’t check the story.
We don’t need everyone to be a philosopher. But we do need to build systems that make it harder to look away.
If AI tempts us to outsource judgment, then ethics must become a discipline, not just a discussion. Fromm warned that people flee to authority, but that also means authority can be used differently. It can model courage instead of control. Humility instead of hubris. Structure, not domination.
So, the real hope is this: We don’t need everyone to do the emotional labor all the time. But we do need to design systems that make it visible, contagious, and institutionalized — so people aren’t starting from scratch every time. Because left alone, we will not rise to the moment. But if we structure for conscience, we just might stay human in the loop.