Cultural Foresight: Where Futures Actually Land
Cultural lenses for foresight, behavioural emergence, and the layer most strategy quietly skips
By the time Meta began silently winding down Horizon Worlds in early 2026, after roughly eighty billion dollars and several years of relentlessly framing the metaverse as the inevitable next platform, the project had become something close to a case study in what foresight tends to get wrong. The hardware mostly worked. The infrastructure was in place. The capital was almost limitless, and the competitive logic — that whoever built the next interface layer would shape the next two decades of digital life — was internally coherent enough to convince a lot of serious people. What was missing was something both more obvious and harder to model: a cultural reason for ordinary people to spend their evenings as cartoon avatars in a low-resolution office park.
The Meta example is easy to laugh at in retrospect, but the underlying mistake is the same one that hollows out a lot of foresight work in less visible ways. We tend to treat the future as something delivered through technology, capital, regulation and the occasional shock, and we treat people as the receiving end of that delivery (adopters, users, segments, target audiences). That isn't where futures actually land. Futures land in habits, in identities, in the small daily decisions of people who carry their own histories, beliefs, fears and reference points into every new tool, system or institution put in front of them. They land in what people consider normal, what they find dignified, what they refuse without ever quite explaining why.
That is the territory cultural foresight tries to take seriously, and it is the territory some parts of traditional foresight quietly tend to skip.
Behaviour as a negotiation between forces
If you look closely at how new behaviours actually emerge, not the headline early-adopter behaviours that fill case studies but the durable ones that reshape how a society eats, works, parents, votes, prays or rests, you find they almost never come from a single force. They emerge as the negotiated outcome of at least eight overlapping ones, each pulling in its own direction. The interesting analytical work is reading the interplay between them rather than isolating any one.
Begin with the biological layer, which is also the oldest and least flattering: appetite, craving, the chase for stimulation, the relief from discomfort. Sugar, scrolling, alcohol, gambling apps, ultra-processed food, the slot-machine logic of social platforms — behaviours that survive every wave of public-health information because they hook into something more durable than education or argument. Foresight that ignores this layer consistently overestimates how rational future consumers, citizens or employees will be, and consistently underestimates how much of the next decade's behavioural shift will simply be old appetites finding new delivery mechanisms.
Above that sits the layer of shared meaning, where people act in ways that feel symbolically or morally right rather than instrumentally efficient. Religion is the obvious example, but most of culture works this way. People light candles in November, follow a football team, name their children after grandparents, refuse to negotiate certain prices, queue politely or refuse to queue at all. Seen from outside, these behaviours look irrational, until you realise they are doing the quiet work of belonging, grief, inheritance and dignity. Any future that tries to optimise these away usually finds them growing back in unexpected forms.
Then there are systems and roles, where institutions slowly redefine what a normal person is even supposed to do. A generation ago, raising a child looked nothing like the optimisation project it has quietly become: measured, scheduled, monitored, photographed, faintly anxious. Patients have become wellness consumers. Citizens have become customers of the state. Users have become creators, performers, and unpaid content workers for platforms whose terms they will never read. None of these shifts feel like decisions while you are inside them; they feel like the natural shape of contemporary life, which is precisely what makes them so easy to miss in foresight that watches only technology and markets.
Technology and affordances do matter, of course, although rarely in the way the hype cycle wants you to believe. The interesting question about a new tool is never whether it works, but which behaviours it makes easier, faster or more visible until they quietly become the default. The smartphone did not predict TikTok. The early web did not predict the comments section. Generative AI is not predicting whatever its dominant behavioural use will be in five years' time, and anyone telling you otherwise is selling something. Affordances reshape behaviour through a slow process of becoming ordinary, and that process is mediated by every other force on this list.
Hype and speculation do their own distinct work, and they are constantly mistaken for genuine adoption. People pick up behaviours because something feels exciting, futuristic or status-conferring, regardless of whether the underlying utility ever materialises. NFTs and the metaverse were the textbook recent cases, but you can see the same dynamic playing out today around certain corners of generative AI, longevity supplements, the more performative kinds of biohacking, and the surge of executives suddenly using the word "agentic" without quite knowing what it means. Hype-driven behaviour can collapse as quickly as it formed, often the moment the cultural mood turns and being early stops feeling smart.
Social expectations are perhaps the most underestimated of the eight. Once Amazon trains a generation to expect next-day delivery, it starts feeling absurd that the public hospital cannot text a clear appointment time, or that the doctor won't simply write a prescription for whatever your favourite AI recommended. Once Spotify trains people to expect personalisation, they begin to resent any institution that treats them as a category rather than a person. Anthropologists call these liquid expectations, and they flow silently from one sector into another, resetting the baseline for what people consider acceptable service, communication or care. Many of the legitimacy crises currently facing public institutions are, at root, expectation crises imported from elsewhere.
Trust and legitimacy gate almost everything else. People act on a message, a system or a technology only when they trust the actor behind it, and the contemporary erosion of institutional trust is not a side issue or a communications problem to be managed. It is a primary force shaping which futures can land at all. Public-health authorities, banks, news outlets, governments, large platforms and science as a whole each operate today on a different and shrinking budget of public belief, which means even technically excellent futures can be vetoed by communities that no longer trust the messenger.
And then there are countermovements and resistance, often treated as marginal until they are not; the frictions they generate are where foresight can become genuinely valuable. When a culture pushes hard in one direction — optimisation, productivity, rationalisation, surveillance, acceleration — it generates the conditions for its own opposition. The current resurgence of slowness, embodied practice, spirituality, psychedelics, off-grid experiments, anti-tech sentiment, religious revival in unexpected demographics, and various forms of romanticism and folk craft is not a fringe quirk to be filed under lifestyle. It is what cultures do when they feel over-engineered. Reading these countermovements early is one of the more reliable ways to anticipate which dominant futures will trigger durable resistance, and where the next round of legitimate cultural energy will gather.
None of these forces operates alone, and the most interesting behaviours sit at their intersections. The Ozempic phenomenon makes sense only when you read biology, hype and social expectation together. The reorganisation of work after 2020 was systems, technology, identity and trust all rearranging at once. The slow decline in teenage smartphone hours now showing up in several Nordic countries sits at the intersection of countermovement, parenting roles, and an emerging public consensus that something has gone wrong. Futures arrive as the negotiated outcome of all eight forces, not as the linear extrapolation of any one of them, and treating any single force as the protagonist is the surest way to misread the next decade.
Why the cultural layer keeps getting skipped
If this is so obvious in retrospect, the question is why so much foresight still treats culture as decoration around the main analysis rather than the medium that decides whether anything else takes hold.
Part of the answer is methodological. Culture is hard to quantify, slow to move, and difficult to model with the tools foresight has inherited from strategic planning and economics. The dominant workflows, from scanning databases and plotting drivers to building 2x2 scenario matrices and even sometimes weighting probabilities, work better on things that can be measured. Culture gets folded in as a vague risk factor labelled "social" in a PESTEL column, while the harder analytical work happens around technology, regulation and demand.
Part of it is institutional. The clients commissioning foresight tend to come from technology, finance, government and large incumbents, and they tend to want futures that look like extensions of what they already do well. A foresight project that takes culture seriously will almost always surface things that are politically uncomfortable: that customers find your latest interface alienating; that the future your strategy assumes is one your own employees do not quite believe in; that the regulator you depend on is losing public legitimacy faster than your model accounts for. These are useful insights, and they are awkward to deliver.
And part of it is generational. The first wave of strategic foresight grew up alongside a great post-war faith in technology, planning and growth, and it inherited some of those defaults. Culture was treated as friction to be overcome rather than terrain to be read. We are slowly outgrowing that, but the inherited reflex is still strong: when in doubt, point at the technology curve.
Stewart Brand's pace layers are useful here. Fashion and commerce sit on top, moving fast. Infrastructure and governance move more slowly. Culture sits beneath them, and nature underneath everything. Most foresight operates near the top of this stack, where things change visibly and quickly, but the layers underneath are where decisions actually stabilise or fall apart. A technology that the surface culture welcomes but the deeper layer rejects is a technology with a short half-life. A policy that contradicts a deep cultural narrative will not be implemented well, however elegantly written. The metaverse, on this reading, was not a failed technology bet at all. It was a perfectly good technology bet placed several layers higher than the layer that decides what people actually want their evenings to feel like.
Reading the layer that does the deciding
Cultural foresight does not have a single methodology, and probably should not. It is emerging, just like the culture it aims to understand better. That makes it closer to a sensibility, a willingness to take symbolic and behavioural life as seriously as technical and economic life. But there are practices that operationalise the sensibility, and a few of them are quietly indispensable.
Fringe mapping is the most undervalued. Most signal scanning collects what is already visible, which means it has already been validated, which means it is usually too late to be a competitive advantage. The interesting material lives at the edges, in contradictions, hacks, rarities and the small extreme practices (see the CIPHER framework by Amy Webb) that hint at where mainstream behaviour might be in five years' time. AI is excellent at compressing and surfacing the mainstream, and considerably weaker at noticing what does not quite fit, which is precisely where the early forms of new behaviour usually hide. Reading the fringe still requires human attention, often participatory and ethnographic, sitting in unfamiliar living rooms, watching what people actually do on a Sunday afternoon, listening past the language they use to describe themselves.
Causal Layered Analysis by Sohail Inayatullah pushes a foresight team to look beneath the surface of a trend, into the systems driving it, the worldviews that make it feel inevitable, and the deeper metaphors silently shaping the conversation. Take the longevity wave. The surface story is medical and economic; people are living longer, money is flowing into lifespan extension. The systemic layer is about ageing populations, healthcare costs and the rise of the longevity economy as a market category. The worldview layer treats the body as a project to be optimised and ageing as a problem to be defeated. Beneath that sit metaphors, seeing the body as machine, youth as the ideal state, ageing as enemy, that quietly determine which longevity futures feel desirable and which feel grotesque. A team that reads only the surface will misjudge where the public mood will turn, which it almost certainly will.
Ethnographic futures work is a whole discipline in itself that brings the analysis back to lived experience. Asking real people to describe their everyday life inside a future scenario, in concrete domestic detail, surfaces the small frictions that no scenario document predicts: the technology that worked in the demo and unravels in the kitchen, the policy that looked clean on paper and produces resentment when it lands on a real Tuesday morning. This kind of work is slow, it does not generate impressive charts, and it consistently catches what abstract foresight misses.
The common thread across these practices is straightforward enough. They treat people as the signal rather than the noise, and they assume that the texture of ordinary life is where the future is actually being negotiated, not where it is being passively received.
A reflexivity that most foresight skips
There is a more uncomfortable layer underneath all of this, and it deserves more attention than it usually gets. There is no neutral view of the future. Every scenario, signal and forecast is filtered through the cultural assumptions of the people doing the work. What feels plausible to a foresight team in Copenhagen will not feel plausible in Lagos, Seoul or the rural United States. What looks like inevitable progress to one community looks like loss or imposition to another. Whose future, exactly, is the foresight describing, and which futures are quietly being treated as unthinkable because they do not fit the lens of the people in the room?
Cultural foresight, taken seriously, asks practitioners to make their own lens visible. Not to apologise for it, but to know which futures their assumptions naturally make legible, and which ones they hide. The role shifts toward holding multiple visions in dialogue, surfacing the tensions between them honestly, and resisting the comfort of a single confident answer. It is a different form of expertise than the one futurists are sometimes asked to perform on stage, and it produces sharper questions and more grounded options rather than impressive predictions. In my experience it also tends to make the rooms it enters more honest about what they actually know, which, depending on the room, is sometimes welcome and sometimes very much not.
What this asks of organisations
For an organisation, the practical implication is uncomfortable but highly useful. If futures are negotiated culturally and behaviourally, then any strategy that ignores the eight forces shaping behaviour is a strategy partly built on hope. The relevant questions in the room shift accordingly. Which behaviours among your customers, citizens or employees are genuinely shifting, and which are hype that will fade once the cultural mood turns? Which behaviours carry symbolic weight that your dashboards will never capture? Which futures do the people you depend on actually find culturally plausible, and which ones will they quietly refuse, regardless of how much you invest in convincing them otherwise?
A useful test, if you want one: when your foresight outputs reach the people who have to act on them, do they recognise themselves in the picture, or are they being pitched a version of the future they don't quite believe? The first is rare, and worth building toward. The second is the cultural layer telling you it has been left out of the analysis, and no amount of additional signal collection will fix it.
Towards a more honest kind of anticipation
The most useful futures work I have seen has rarely been the most confident. It is the work that knows where its own lens sits, takes the cultural and behavioural layer as seriously as the technological one, and treats human action as the place where futures are actually made, slowly, contradictorily, and almost never in the way the early adopters or the deck-builders predicted. Behaviours emerge from a tangled set of forces that pull against one another. Cultures shape which behaviours stick and which never quite take. Foresight, at its best, is the practice of paying close enough attention to both that you can begin to read the negotiation as it happens, and being honest about the parts of it you still cannot see.
What this rules out, in the end, is a particular kind of false confidence, the kind that mistakes a coherent slide deck for an accurate read of how a society is going to live. There is plenty of that confidence still on offer. The more interesting question is whether your organisation can afford to keep buying it.
Learn more in our Cultural Foresight Masterclass:
How do we anticipate change when people do not behave as the data says they should?
Culture often signals what is shifting before it shows up clearly in reports, forecasts, or dashboards. It shapes how change is interpreted, resisted, adapted, and made part of everyday life. That is why cultural foresight matters. It helps us look beyond surface trends and understand the values, tensions, narratives, and behaviours that make futures feel plausible, desirable, or difficult to accept.
In this practical masterclass, futurist Mathias Behn Bjørnhof (ANTICIPATE) and cultural foresight practitioner Chloé de Ruffray explore how cultural foresight can strengthen strategic foresight by bringing in the human layers too often left out. The session moves from why culture matters in foresight to how to work with it in practice, showing how lived realities, shared meanings, expectations, trust, resistance, and emerging behaviours can all become valuable signals of change.
Across the replay, you will be introduced to a set of cultural foresight tools and ways of thinking, including fringe mapping, Causal Layered Analysis, Thing from the Future, and Ethnographic Futures Research. Together, these methods help move from scanning the edges, to sensing deeper patterns, to imagining more grounded and tangible futures, and finally to translating those insights into strategic reflection and action.
This replay is for strategists, innovation teams, brand and communication professionals, researchers, and anyone curious about how to understand change in a more human, nuanced, and futures-oriented way.
Replay includes:
Recording, slides/workbook, and Q&A recording.