AI is not just a technological threshold. It’s a mirror.
In a world shaped by AI and major social changes, leadership is no longer about control or certainty. It is about navigating the in-between: the space where yesterday's models no longer work and tomorrow's possibilities are still emerging.
We are living in a perpetual in-between, a transition that is not only technological, but also deeply psychological, cultural and geopolitical. The term “disruption” has become almost meaningless in such a moment. What is happening now demands a different kind of attention, for AI is not just another wave of innovation, but a shift in how meaning and agency are organised. It confronts us with fundamental questions about freedom of choice, identity and the frameworks with which societies and cognitive ecosystems organise meaning.
When I bring this up in conversations with senior leaders, or when I prepare for my next keynote speech, I know that I evoke fear and even anger in some people. Not because the subject is unfamiliar, but because it touches on something more disturbing than strategy or implementation. It touches on the question of what remains uniquely human as systems become increasingly sophisticated and begin to reflect human language, judgement and emotional tone.
The fascination with AI is rarely about the technology itself. It is about what we project onto it. Anthropomorphism, our tendency to attribute human characteristics to non-human entities, is an ancient phenomenon, but AI amplifies it in unprecedented ways. This is not a new impulse. As early as the 1960s, the so-called ELIZA effect showed that even a rudimentary chatbot could elicit emotional responses and assumptions about intent. People knew they were communicating with code, and yet they still experienced the interaction as relational.
Today, that dynamic has become more complex. Cognitive ecosystems do not only respond; they exercise agency. They simulate coherence, empathy and even wisdom. This is where the risk begins to shift: not in the idea that machines are becoming human, but in the possibility that humans are adapting to machines, flattening their own complexity to fit algorithmic logic.
This psychological dimension is inseparable from the broader geopolitical landscape. Different regions are developing AI within radically different value systems, and those differences are no longer abstract. We are entering a phase of growing ethical divergence, in which leaders must navigate the risks and opportunities posed by regulatory fragmentation and geographical silos between Europe and the rest of the world. This divergence is already becoming tangible, as major technology companies begin to treat Europe as a separate regulatory environment, delaying or adapting AI-related releases in response to European constraints. When Apple or Google considers different product trajectories for Europe than for the United States, the Middle East, or Asia, it signals that fragmentation is no longer a future risk but an emerging reality.
The question, then, is not simply how leadership should develop in this liminal space, but whether we are willing to face what this moment asks of us. These shifts are not distant or abstract. They are already here, confronting us directly with the question of what truly makes us human.
That is why leadership now begins with the courage to look into the mirror, rather than putting our heads in the sand. It requires a learning mindset grounded in the very competencies that define human agency: imagination, curiosity and what E.O. Wilson called "consilience", the ability to connect knowledge from different disciplines into a meaningful orientation.
AI is not just a technological threshold. It is a mirror. The challenge is whether we can look into it, and still choose to remain fully human.
Katja Schipperheijn is an internationally awarded author, strategist, futurist, keynote speaker and entrepreneur. Her books Learning Ecosystems and The Learning Mindset have profoundly shaped the way we approach learning, innovation and leadership. She is also a member of the jury of the CDO of the Year Awards.