Drift
I’ve spent much of my career trying to reduce friction in systems — removing barriers between buyer and seller, disintermediating broken supply chains, smoothing coordination across institutions. Friction was the enemy. Every abandoned shopping cart, every failed transaction, every unnecessary intermediary was a problem to be solved.
Lately I’ve started to wonder whether we removed the wrong kinds.
I read an article recently suggesting that reality itself is not something we passively perceive, but something our brains construct — an internal model continuously updated through interaction with other people and the physical world. What feels solid and unquestionable is actually a shared map, stabilized socially. The article referenced Jonestown — the mass suicide in Guyana — as an example of what can happen when a group’s internal model of reality drifts so far from external correction that death begins to feel coherent.
That idea stopped me.
I build mapping systems for a living. I think about shared situational awareness every day. A map is never the territory. It is a representation, useful only insofar as it updates. It must incorporate new signals. It must correct errors. It must integrate dissenting data points. If it doesn’t, navigation fails.
Healthy systems require misalignment signals.
In engineering terms, friction is feedback.
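A minimal sketch of that claim, in code (the quantities, noise levels, and gain here are invented purely for illustration): an estimator stays accurate only because each cycle feeds the error between an external reading and the internal model back into the model. Set the feedback gain to zero and the estimate simply stops tracking a moving world.

```python
import random

def track(world, estimate, gain, steps=100):
    """Follow a moving quantity using noisy external readings.

    Each cycle, the error between reading and estimate feeds back
    into the estimate. With gain = 0 there is no feedback: the model
    never updates, and the world moves on without it.
    """
    for _ in range(steps):
        world += random.gauss(0, 0.1)            # the territory keeps changing
        reading = world + random.gauss(0, 0.5)   # a noisy, frictional signal
        estimate += gain * (reading - estimate)  # the correction is the feedback
    return abs(world - estimate)                 # final model error

random.seed(1)
print(track(world=0.0, estimate=0.0, gain=0.3))  # usually small: kept correcting
print(track(world=0.0, estimate=0.0, gain=0.0))  # usually much larger: nothing fed back
```

The noisy reading is the friction. It is irritating, it is never exactly right, and without it the model has no way to know it is wrong.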
For seven years I lived as a nomad, moving through languages and cultures. I was further from my familiar social anchors than I had ever been. The people I knew best were in London; my daily life unfolded elsewhere. And yet it was one of the least psychologically drift-prone times of my life.
Because friction was everywhere.
Ordering coffee in another language. Negotiating contracts across cultures. Misunderstanding humor. Getting lost. Being corrected. Living inside different social norms. Each interaction forced my internal model to update. It was impossible to float too far into abstraction when reality kept interrupting me in another grammar.
Isolation, I’ve come to think, is not simply the absence of people. It is the absence of feedback from the world.
But over-conformity can produce a different kind of isolation. Large groups can reinforce moral models that are internally coherent but externally ungrounded. Closed feedback loops. Escalating certainty. Shared narratives that become impermeable to contradiction. When dissent collapses, correction becomes difficult.
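Here is a toy sketch of that loop (my own illustrative model, not a description of any real platform or study): a group whose members average their beliefs only with each other converges quickly, then wanders together, while the same group anchored to even a noisy external signal stays calibrated. Coherence and accuracy come apart.

```python
import random

def simulate(anchor_weight, agents=20, steps=500, truth=0.0):
    """Each step, every agent moves toward the group mean and,
    if anchor_weight > 0, toward a noisy reading of the truth."""
    beliefs = [random.gauss(truth, 1.0) for _ in range(agents)]
    for _ in range(steps):
        mean = sum(beliefs) / agents
        beliefs = [
            b
            + 0.5 * (mean - b)                                # conformity pull
            + anchor_weight * (random.gauss(truth, 0.5) - b)  # external friction
            + random.gauss(0, 0.05)                           # idiosyncratic noise
            for b in beliefs
        ]
    return abs(sum(beliefs) / agents - truth)  # how far the group drifted

def average_drift(anchor_weight, trials=50):
    return sum(simulate(anchor_weight) for _ in range(trials)) / trials

random.seed(7)
print(average_drift(0.1))  # anchored: the group stays near the truth
print(average_drift(0.0))  # closed loop: coherent, but noticeably farther away
```

Nothing in the closed-loop group feels wrong from the inside. Every agent is close to its neighbors at every step. The drift is only visible from outside.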
We are living in a period when many large groups appear to be operating from incompatible maps of what is real, what is dangerous, what is permissible. It is tempting to call this polarization. It may be something deeper — a weakening of the mechanisms that once kept our models aligned closely enough to share a world.
Digital platforms did not invent drift, but they accelerated it. They segment attention into tailored environments where signals circulate but rarely cross. The natural calibration that once came from shared physical spaces — the same storms, the same harvests, the same public squares — has been replaced by algorithmic alignment.
It is not surprising that a culture that lives inside models and builds virtual worlds begins to suspect that the world itself is code. The simulation hypothesis feels almost intuitive in a society where abstraction has outpaced embodiment. When most of your work exists in symbolic layers rather than physical ones, it becomes easier to treat reality as provisional.
At the same time, new forms of synthetic participation are emerging. Chatbots populate social space. Artificial agents can reinforce narratives without sharing embodied constraint. They do not experience gravity, hunger, weather, or death. Their models are symbolic. If such systems begin shaping shared perception at scale, calibration changes again.
Evolution does not optimize for truth. It optimizes for survival. Shared models can include useful fictions as long as they help a group persist. The danger arises when those fictions stop being adaptive and drift beyond correction.
I believe deeply in technology’s purpose. I have spent years building tools meant to help people see the same landscape at the same time — especially in moments when misalignment has real consequences: storms, disasters, institutional breakdown. Used well, technology can restore some of what small communities once had naturally: the ability to coordinate when the stakes are physical and immediate.
But these same tools can also reduce friction. And when friction disappears entirely, so do the signals that tell us we might be wrong.
If reality is a shared map, then its stability depends on tending — on the continuous willingness to update in response to friction, dissent, and constraint. Not the elimination of disagreement, but the preservation of correction. Not perfect unity, but enough overlap to prevent synchronized drift.
I don’t remember another time in my life when the fragility of our shared maps has felt so exposed. If tending fails, maps do not simply become inaccurate. They become dangerous.
And I’m no longer sure we understand which frictions we can afford to remove.


