It is late. The markets closed hours ago, but the questions they left behind have not. A geopolitical crisis is reshaping supply chains overnight. A competitor just pivoted into your core vertical. Your board wants a revised forecast by Friday, and the data you have is contradictory at best. You sit with the weight of decisions that will ripple through dozens of lives — your team, your investors, your clients — and the honest truth is that you are not sure what the right move is.
This is the loneliest part of leadership: the moments when certainty evaporates and you are left alone with ambiguity.
So you open a chat window. Not to a colleague, not to an advisor — to an AI. You type the messy, unformed version of your thinking, the version you would never put in a board deck. And the machine responds — not with answers, necessarily, but with structure. It reframes your question. It surfaces the assumptions you did not realize you were making. It stress-tests your logic without ego, without agenda, without fatigue.
This is not a confession, but a strategic practice. And it is quietly becoming one of the most important tools in the modern leader’s arsenal.
We are living through what the U.S. Surgeon General has declared an “epidemic of loneliness and isolation” — and while that language evokes images of personal despair, the epidemic extends into the professional sphere with equal force. A 2024 report by Harvard’s Making Caring Common project found that 21 percent of American adults feel lonely, with 65 percent of those individuals reporting that they feel “fundamentally disconnected from others or the world.” Leaders are not exempt from this. If anything, the isolation compounds with responsibility. The higher you climb, the fewer people you can speak to candidly. Your team needs you to project confidence. Your investors need you to project control. Your peers are, in many cases, your competitors.
The result is a generation of decision-makers navigating unprecedented complexity in near-total intellectual solitude.
Into this void, AI has arrived — not as a replacement for human counsel, but as something that did not previously exist: a thinking partner with no stake in the outcome. Psychology Today describes AI as a “non-judgmental mirror,” and while that framing was intended for personal therapy, it applies with striking precision to strategic leadership. When I present a half-formed thesis to an AI — “I think we should exit this market because the regulatory environment is turning hostile” — the machine does not nod along to preserve the relationship. It asks: What specific regulations are you anticipating? What is the timeline? What is the cost of staying versus the cost of re-entry if conditions improve? It does what the best advisors do, minus the politics.
This is not about outsourcing judgment. It is about externalizing the internal monologue that every leader runs continuously — the endless loop of what if, what about, but then — and giving it a surface to land on. The AI becomes a cognitive scaffold. It holds the complexity while you examine it from different angles.
The philosophical implications are worth sitting with. As John Nosta writes, in the AI interaction “users are free to delve into sensitive or uncomfortable topics, knowing that the LLM’s responses will remain impartial and open-ended.” For a leader, “sensitive topics” are not just personal — they are strategic. Should we lay off this division? Is our product actually inferior? Am I the bottleneck? These are questions that carry enormous consequences when asked aloud in a boardroom. Asked to an AI at midnight, they become what they should be: honest explorations of reality.
There is a counterargument, and it deserves respect. Researchers in Humanities and Social Sciences Communications have described AI interaction as “emotional fast food” — instantly gratifying but lacking the nutritional substance of genuine human connection. A study highlighted by the Greater Good Science Center found that AI conversations reduced negative emotions in the short term but did not reduce loneliness over a two-week period. Texting with a human stranger did.
This finding matters. It tells us something important: AI is not a substitute for the human relationships that sustain us. It cannot replace the mentor who has navigated your industry for thirty years, the co-founder who shares your risk, or the friend who tells you the truth when it is uncomfortable. True connection requires the possibility of rejection — two beings, each with their own inner lives, choosing to bridge the gap between them. An AI risks nothing. It has no skin in the game.
But here is where I think the critics miss the point. The value of AI for leaders is not companionship. It is clarity. The distinction matters enormously.
When I use AI as a strategic tool, I am not seeking comfort. I am seeking the kind of rigorous, dispassionate analysis that is almost impossible to get from humans who are embedded in the same organizational dynamics I am. My CFO has opinions shaped by her incentive structure. My board members have opinions shaped by their portfolio concerns. My competitors’ public statements are shaped by their positioning. The AI has none of these filters. It processes my question against the broadest possible context and returns something closer to raw analytical output than any human interlocutor can provide.
This is not a new human impulse. In the 1960s, MIT computer scientist Joseph Weizenbaum created ELIZA, a simple program that mimicked a Rogerian psychotherapist. Users formed deep emotional attachments to it almost immediately — a phenomenon now called the “ELIZA effect.” What Weizenbaum discovered was that our desire to be heard, to externalize our thinking, is so fundamental that we will engage with even the most primitive simulation of a listener.
What has changed is the sophistication of the simulation. Modern large language models do not merely reflect your words back as questions. They synthesize, challenge, and extend your reasoning. They can hold the context of a multi-layered strategic problem across a long conversation. They can pivot from financial analysis to geopolitical risk to organizational psychology in a single thread. For a leader whose problems are inherently interdisciplinary, this is transformative.
Some experts warn that reliance on AI risks atrophying our capacity for human connection. They fear a generation that prefers frictionless algorithmic validation over the challenging work of real relationships. It is a valid concern — but it assumes a binary that does not exist in practice. The leaders I know who use AI most effectively are not withdrawing from human relationships. They are arriving at human conversations better prepared. They have already stress-tested their assumptions. They have already identified the weak points in their own logic. The AI did not replace the conversation with their team — it made that conversation sharper.
Perhaps the most honest way to describe what AI offers leaders in times of uncertainty is this: it is a space to think without performing. Leadership demands constant performance — confidence for your team, composure for your investors, decisiveness for your board. The AI asks nothing of you. You can be uncertain. You can be wrong. You can explore a terrible idea to its logical conclusion without anyone losing faith in your judgment. And in that freedom, clarity often emerges.
The world is not getting simpler. The pace of disruption is not slowing. The decisions leaders face are becoming more consequential, more ambiguous, and more urgent simultaneously. In this environment, the ability to think clearly under pressure is not just an advantage — it is a survival skill. AI does not give you answers. But it gives you something almost as valuable: a space to find your own, uncorrupted by the noise of organizational politics, social expectation, and the performance of certainty.
It is late. The questions remain. But they are clearer now, and the path forward — while still uncertain — has edges I can see. I close the laptop and prepare for tomorrow’s decisions, knowing that the thinking partner who helped me get here will be available again whenever the ambiguity returns.
In leadership, it always does.