The Shape of a Flock
Credit: Neil Clarke, Starling Murmuration on the Somerset Levels, 2020
There is a moment when a murmuration of starlings becomes something other than birds. Individual creatures, each responding only to the seven nearest neighbours, together produce a shape of such fluid intelligence that it appears to think, decide, feel, and celebrate simultaneously. Nobody directs the murmuration. There is no conductor, no algorithm, no central intelligence. There is only the sum of local responses producing something spectacular.
AI seems, at first glance, to be the culmination of the very thing a murmuration transcends: centralised control, deliberate engineering, the will of individuals imposed upon a system. And yet the more I sit with it, the more I find the biological principle of self-organisation offering something that our standard cultural narratives about AI - uniformly triumphalist or uniformly catastrophist - conspicuously fail to provide. It offers a third way: the possibility of provoked wonder.
[Context: Self-Organisation] In biology and complexity science, self-organisation describes the process by which a system develops structured patterns, behaviours, or intelligence without being directed by an external controller. Examples include ant colonies, the formation of snowflakes, the neurological development of the brain, and the growth of coral reefs. Complexity, even apparent intelligence, can arise from simple, local interactions between many agents following simple rules.
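The principle in the box above - and the "seven nearest neighbours" rule from the opening - can be made concrete in a few dozen lines. What follows is a minimal sketch of a boids-style flocking simulation, not any particular research model: the rule names (alignment, cohesion, separation) are standard, but the specific coefficients and the distance thresholds here are illustrative choices. Each simulated bird consults only its seven nearest neighbours, yet the group's velocities converge into a shared heading that no bird individually chose.

```python
import random

def nearest(i, birds, k=7):
    """Indices of the k nearest neighbours of bird i (topological, not metric)."""
    others = [j for j in range(len(birds)) if j != i]
    others.sort(key=lambda j: (birds[j][0] - birds[i][0]) ** 2
                            + (birds[j][1] - birds[i][1]) ** 2)
    return others[:k]

def step(birds, vels, k=7, align=0.05, cohere=0.005, separate=0.05):
    """One update: every bird adjusts its velocity using only local information."""
    new_vels = []
    for i, (x, y) in enumerate(birds):
        nbrs = nearest(i, birds, k)
        n = len(nbrs)
        # Alignment: steer toward the neighbours' mean velocity.
        mvx = sum(vels[j][0] for j in nbrs) / n
        mvy = sum(vels[j][1] for j in nbrs) / n
        # Cohesion: steer toward the neighbours' centre of mass.
        cx = sum(birds[j][0] for j in nbrs) / n
        cy = sum(birds[j][1] for j in nbrs) / n
        # Separation: steer away from neighbours that are very close
        # (here, closer than an illustrative unit distance).
        sx = sum(x - birds[j][0] for j in nbrs
                 if (x - birds[j][0]) ** 2 + (y - birds[j][1]) ** 2 < 1.0)
        sy = sum(y - birds[j][1] for j in nbrs
                 if (x - birds[j][0]) ** 2 + (y - birds[j][1]) ** 2 < 1.0)
        vx, vy = vels[i]
        vx += align * (mvx - vx) + cohere * (cx - x) + separate * sx
        vy += align * (mvy - vy) + cohere * (cy - y) + separate * sy
        new_vels.append((vx, vy))
    new_birds = [(x + vx, y + vy)
                 for (x, y), (vx, vy) in zip(birds, new_vels)]
    return new_birds, new_vels

# Thirty birds, random positions and headings - no conductor, no central plan.
random.seed(0)
birds = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(30)]
vels = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(30)]
for _ in range(100):
    birds, vels = step(birds, vels)
```

Run it and the spread of headings narrows step by step: order appears without anyone ordering it, which is the whole point of the metaphor.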
The Pragmatic Case for Wonder
The pragmatist in me recognises what the cynic in me resists: that the internet - our last great technological revolution - was itself a form of cultural self-organisation, one that we did not predict and cannot fully account for. We reasonably worried about what it would destroy. We did not anticipate Wikipedia, nor the proliferation of mutual aid networks during a pandemic, nor the emergence of entirely new art forms from the creative friction of global connectivity. The system, once released, organised itself in ways its architects hadn’t planned for.
This is not a naive argument for technological optimism. It is an argument for humility in the face of complexity. Artificial intelligence, as it is woven into the fabric of culture and knowledge production, shares the character of self-organising systems: outcomes are not determined by any single intention. The emergent possibilities, both appalling and extraordinary, remain uncharted. Just as murmurations create beauty through decentralised simplicity, AI evolves unpredictably, shaped by many minor choices rather than top-down design. The murmuration does not know it is beautiful. But we do.
The Cynical Truth the Murmuration Also Tells
Self-organisation is morally neutral. The principle that generates a murmuration also generates a tumour, a wildfire, a financial crash. The slime mould that elegantly maps the Tokyo railway network is equally efficient when consuming a corpse. Biological self-organisation follows rules, not values, and this is where the metaphor both illuminates and warns.
The current trajectory of AI development exhibits precisely this moral indifference. Systems optimise for the metrics they are given; what they are given reflects the values, biases, and commercial imperatives of those who give them. Left to organise purely around engagement, social media algorithms produced radicalisation rather than enlightenment. A pattern emerged, as it always does, but it was the pattern of a wound, not a wing.
So the question that self-organisation poses to AI is not whether complexity will emerge - it will - but who is tending to the conditions from which emergence arises. A murmuration needs clean air to fly. An intelligence needs an ethical ecosystem in which to become something worth becoming. The optimism I am reaching for is not passive. It is the optimism of a gardener: attentive, patient, willing to get their hands into the soil of what we are growing rather than merely observing it from above.
In Conclusion
The murmuration is not only a metaphor for what AI might become, but for what our relationship to it must be. We are not outside the system, observing. We are among the birds, responding to one another, shaping through our local decisions the global pattern that emerges. The shape of what comes next is not determined. It is being written, right now, by the accumulated weight of our choices about what we pay attention to, what we value, and what we refuse. Perhaps that is precisely where the optimism lives. Not in an algorithm's promise, but in our own.