Consider the totality of all human communication, opinion, knowledge, and the mechanisms by which these things change over time. This totality is an example of a hyperobject, but we don't have a good term to describe it. Some people call it the information environment or the marketplace of ideas, but to use such terms is to assume that environmental or economic metaphors are appropriate. For lack of a better term, I will refer to it as public discourse.
This post is an inventory of metaphors I've encountered for public discourse.
Metaphors are useful in much the same way that taxonomies are useful. Noticing sparsely populated sections of a well-constructed taxonomy can prompt us to perceive and articulate phenomena that would otherwise be subject to hypocognition. In a similar way, exploring the unused parts of existing metaphors can guide us towards new conceptual frameworks on which to base future action.
For example, if public discourse is a marketplace, what are the goods or services being traded? What is the currency? What types of market failure are possible? I've done some of this conceptual algebra in the course of compiling this inventory, but more could be done. In particular, I think it is worth thinking through what regulatory frameworks are implied by different metaphors.
This post is a working document, and will be updated and revised as I encounter new metaphors and my thinking evolves. I try to make each system of metaphors somewhat coherent, but this is a work-in-progress. Throughout, I've included (in blue) suggestions for how the metaphors might be plausibly extended.
Please send feedback or suggestions to .
Economic
Those who have new ideas, or advocate for existing ones, are ‘producers’. Those who agree with an idea are ‘consumers’.
Idioms: “marketplace of ideas”
- Words don’t have standardised value. Their meaning depends on who says them and in what context.
- Information has three main properties that complicate its role in market transactions. (1) It is an experience good: you must experience an information good before you know what it is. (2) It exhibits high returns to scale: information has a high fixed cost of production but low marginal costs of reproduction. (3) It is often a public good: information goods are typically non-rivalrous and sometimes non-excludable. See Hal Varian, Markets for Information Goods (1998).
Jonathan Stray, in response to criticisms of the engagement-optimising behaviour of online platforms, proposes that we think of attention as something like profit. No media entity is sustainable if it does not continue to attract attention, but there are downsides to optimising for attention to the exclusion of other considerations.
Idioms: “paying attention” / “time well spent” / “liar’s dividend”
Those with attention to give are ‘producers’ of attention, and those who want other people’s attention are ‘consumers’ of attention.
Another metaphor is that of politics as a marketplace of rage, as explicated by Peter Sloterdijk. For now I am classifying that metaphor as a subset of this one, with rage being a particular variety of attention. At some point it may make sense to separate it out as a distinct metaphorical framework.
Idioms: “attention economy”
See: Craig Mod on attention accounting.
Idioms: “attention mongering” / “attention philanthropy”
Criticisms:
- Promotes a left-brained conception of attention-as-resource at the expense of a more right-brained attention-as-experience. (source)
- By conceiving of attention-as-resource, it invites its commodification and exploitation. (source)
- More broadly, promotes a view of attention as homogeneous and interchangeable when in reality there are many different varieties of attention. (source)
(of people’s attention)
- Bank notes are produced with anti-forgery mechanisms: an institution makes the notes hard to create and provides an easily accessible way for individuals and businesses to verify them. Can we create similar content authenticity mechanisms?
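As a sketch of what such a mechanism could look like in software (illustrative only, and not a description of any existing standard), a publishing institution could sign content with a private key that is hard to forge, and anyone could cheaply verify the signature against the institution's published public key. The example below uses the third-party Python cryptography package.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The institution holds a private signing key; its public key is published widely,
# playing the role of the easily checked anti-forgery features on a bank note.
institution_key = Ed25519PrivateKey.generate()
public_key = institution_key.public_key()

article = b"Text of an article the institution vouches for."
signature = institution_key.sign(article)  # hard to produce without the private key

# Anyone can cheaply verify authenticity, analogous to checking a bank note.
try:
    public_key.verify(signature, article)
    print("Content verified as authentic.")
except InvalidSignature:
    print("Content failed verification.")
```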
The ‘money’ is the benefit people think they will get from paying attention to something.
“liar’s dividend”
An ‘economic crash’ caused by a collapse in the value of truth, implying that we were previously living in a ‘truth bubble’ in which truth was overvalued.
Those who manipulate public opinion are ‘producers’ of influence; those who want public opinion to be manipulated are ‘consumers’ of influence.
Implied Interventions:
- Treat ‘degree of personalisation’ in the information environment as an ‘interest rate’ that increases or decreases the cost of influence and polarisation. This might make sense in the context of the middleware proposal. The personalisation rate would be set by a central institution, akin to a central bank. The personalisation rates on particular recommender systems might be allowed to float around the official rate, but not vary too far.
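As a rough sketch of how such a rule might be expressed in code (a hypothetical middleware hook, with invented function and parameter names), each recommender could choose its own degree of personalisation, with the chosen value clamped to a band around the official rate set by the central institution:

```python
def clamp_personalisation(requested: float, official_rate: float, band: float = 0.1) -> float:
    """Hypothetical rule: a recommender may pick its own degree of personalisation
    (0 = fully generic feed, 1 = fully individualised feed), but the value must stay
    within +/- `band` of the official rate set by a central institution."""
    lower = max(0.0, official_rate - band)
    upper = min(1.0, official_rate + band)
    return min(max(requested, lower), upper)

# If the central institution sets the rate at 0.5, a platform requesting 0.9
# personalisation is pulled back to the top of the permitted band, 0.6.
print(clamp_personalisation(requested=0.9, official_rate=0.5))  # -> 0.6
```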
“manufacturing consent”
Environmental
Ideas exist as ‘lifeforms’ in some environment and are subject to evolutionary pressures. Nadia Eghbal puts forward a darker version of the metaphor, in which parasitic ideas are the primary agents in competition, using humans as host organisms.
Full Fact propose classifying information incidents like natural disasters.
Criticisms:
- Nature is violent, and if you think public discourse is like nature then you are likely either to oppose discourse or to condone immoral discourse as inevitable.
- Important human distinctions (such as those between mindlessness and creativity, determinism and choice, right and wrong) do not exist in nature.
- The respective mechanisms of transmission, variation and selection for memes and genes are very different.
- There is no close informational analogue of a species, organism, cell, or of sexual or asexual reproduction.
- Public Health
Ideas are genes, or the organisms that carry them.
Idioms: “fit fiction” / “meme” / “infodemiology” / “going viral”
Criticisms: Many ideas are not like viruses, for which mere exposure leads to infection; being exposed to an idea does not automatically make you believe it. News items have something like this character (because they are new), but political opinions are multisided and the direction of the flow of influence is not as simple. This is the distinction between models of opinion dynamics and models of social spreading phenomena (a sketch contrasting the two follows this list of criticisms).
Autopsy of a metaphor: The origins, use and blind spots of the ‘infodemic’ (2021) by Felix Simon & Chico Camargo.
- What counts as a pathogen? A claim, a story, or a collection of claims or stories? All information, or only ‘bad’ information?
- Diseases are usually not shared intentionally, but information sharing is usually intentional.
- Implies passive audiences who become ‘infected’ with information against their will.
- Unlike a pandemic, there is no single root cause behind the spread of (mis)information.
- Implies that (mis)information can be controlled by something akin to public health measures.
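To make the distinction mentioned above concrete, here is a minimal sketch in Python contrasting a contagion-style spreading update, in which exposure can directly cause adoption, with a simple opinion-dynamics update, in which continuous opinions shift gradually and influence flows in both directions. Both updates are illustrative toy rules, not models taken from the literature, and the parameter values are arbitrary.

```python
import random

rng = random.Random(0)

def contagion_step(states, beta=0.3):
    """Spreading-phenomenon view: a 'susceptible' agent (0) who meets an
    'infected' agent (1) adopts the idea with probability beta."""
    i, j = rng.sample(range(len(states)), 2)
    if states[i] == 0 and states[j] == 1 and rng.random() < beta:
        states[i] = 1
    return states

def opinion_step(opinions, weight=0.1):
    """Opinion-dynamics view: opinions are continuous, influence flows both
    ways, and two agents who meet move slightly towards each other."""
    i, j = rng.sample(range(len(opinions)), 2)
    diff = opinions[j] - opinions[i]
    opinions[i] += weight * diff
    opinions[j] -= weight * diff
    return opinions

# One idea spreading through 10 agents vs. 10 continuous opinions drifting together.
states = [1] + [0] * 9
opinions = [rng.uniform(-1, 1) for _ in range(10)]
for _ in range(200):
    contagion_step(states)
    opinion_step(opinions)
print("adopters:", sum(states), "of", len(states))
print("opinion spread:", round(max(opinions) - min(opinions), 2))
```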
This metaphor is made explicit when reinforcement-learning-based recommender systems interact with humans.
Such as a ‘parasite’.
Idioms: “infodemic”
See Sander van der Linden and Stephan Lewandowsky’s work on psychological inoculation.
“Polarisation is a mental health issue that results from collective trauma and threat perception.” See: the Collective Psychology Project, introduced by Alex Evans here. Also, more on the psychological perspective on polarisation.
“information space” / “information environment” / “consume content”
Criticisms:
- A large part of the ‘information environment’ is online, which differs significantly from the ancestral environments in which our intuitions developed.
- Environmental Protection
- Urban Planning
- Public Health
See: Future Crunch.
Such as ‘pollution’ or ‘wildfire’.
Criticisms:
- Emergency frames can be ‘emotionally draining and create exhaustion, anxiety, guilt and fear.’ They also empower different groups unevenly, and may help authorities justify taking undesirable actions. See The political effects of emergency frames in sustainability (2021) by Patterson et al.
- Regulation to improve the quality of information is environmental protection regulation.
Martial
See: this by Renee DiResta.
Idioms: “contest of ideas” / “information warfare” / “cognitive warfare” / “cognitive security” / “epistemic security” / “culture war”
Criticisms: See Stephen Flusberg, Teenie Matlock & Paul Thibodeau, War metaphors in public discourse (2018).
- Evokes a sense of fear.
- Some work has found that ‘violent metaphors can influence views towards political violence, particularly in individuals with aggressive traits’ (Kalmoe, 2014).
- Reduces people to the battlefields on which the war is being fought, a type of dehumanisation.
- Promotes an adversarial mindset.
- Establishes an expectation that the ‘war’ will, at some point, end, either in victory or defeat.
- The prevalence of conflict-related idioms for disagreement in the English language doesn’t necessarily imply a war metaphor. They are equally applicable to less adversarial concepts like board games or sports.
War constitutes an emergency frame. Emergency frames can be ‘emotionally draining and create exhaustion, anxiety, guilt and fear.’ They also empower different groups unevenly, and may help authorities justify taking undesirable actions. See The political effects of emergency frames in sustainability (2021) by Patterson et al.
See also: Iain King, Towards an Information Warfare Theory of Victory (2020).
Social
“on the one hand …, on the other hand …”
Religious
Ideologies are religions and political parties are organised religions. Political activity is religious activity.
Idioms: “preaching to the converted” / “preaching to the choir”
Mathematical
The claim that public discourse can be formally modelled as a (massive) stochastic process.
Of course, there are many modelling paradigms that could be used. Options include voter models, agent-based models and models from the field of opinion dynamics.
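As an example of how simple such a model can be, here is a minimal sketch of a classic voter model in Python: each agent holds a binary opinion and, at every step, a randomly chosen agent copies the opinion of another randomly chosen agent. The agent and step counts are arbitrary choices for illustration.

```python
import random

def voter_model(n_agents=100, n_steps=10_000, seed=0):
    """Minimal voter model on a complete graph: at every step a random agent
    copies the opinion of another randomly chosen agent."""
    rng = random.Random(seed)
    opinions = [rng.choice([0, 1]) for _ in range(n_agents)]
    for _ in range(n_steps):
        i, j = rng.sample(range(n_agents), 2)
        opinions[i] = opinions[j]  # agent i adopts agent j's opinion
    return opinions

final = voter_model()
print(f"Share holding opinion 1: {sum(final) / len(final):.2f}")
```

Run repeatedly with different seeds, the population drifts towards consensus on one opinion or the other, which is the characteristic long-run behaviour of the voter model.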
All individuals occupy a point in the space, the coordinates of which describe their opinions, worldview, and information consumption habits.
I originally saw this on Twitter but I can’t find the tweet. The claim was that certain geometric properties of high-dimensional space fit with our intuition for how the space of public opinion operates. In particular, it is possible for two points in high-dimensional space to be ‘close’ along each individual dimension, yet for these small differences to accumulate so that the Euclidean distance between the two points is very large. Likewise, in the space of possible worldviews, two individuals may have very similar opinions on the vast majority of topics, and very similar information consumption habits, yet the small differences accumulate and the two worldviews are incompatible overall.
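A quick numerical illustration of that claim, with arbitrary numbers: in a 1,000-dimensional opinion space, two points that differ by at most 0.1 on every single dimension can still sit a Euclidean distance of roughly 1.8 apart, because the many small gaps accumulate across dimensions (the distance grows roughly as the typical gap times the square root of the number of dimensions).

```python
import math
import random

rng = random.Random(0)
n_dims = 1_000

# Worldview A, and worldview B which differs by at most 0.1 on every dimension.
a = [rng.random() for _ in range(n_dims)]
b = [x + rng.uniform(-0.1, 0.1) for x in a]

max_gap = max(abs(x - y) for x, y in zip(a, b))
euclidean = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(f"largest per-dimension gap: {max_gap:.3f}")   # <= 0.1: close on every issue
print(f"overall Euclidean distance: {euclidean:.3f}")  # ~1.8: far apart overall
```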
Thank you to Peter Pomerantsev for a conversation on this topic.