The Inevitable Evolution of Communication Technologies
Bay Area Nonprofit Develops a Solution to Online Misinformation
Plagues are not generally known for causing outbreaks of rational thinking. As late as the nineteenth century, before it was discovered that a bacterium was the cause of tuberculosis, it was widely believed throughout New England that the disease was caused by the dead consuming the lives of their surviving relatives. The disease, after all, appeared to ravage entire families. This vampiric theory of tuberculosis led to such remedies as exhuming dead relatives, decapitating their corpses, driving stakes through their hearts, burning their bodies, and ultimately eating their ashes, to no salutary effect.
This is but one example of the predicament of the human animal. Our greatest evolutionary advantage is our ability to use the information provided by culture to simplify, interpret, and guide our experience. This is for better and for worse: when cultural information is distorted, the distortion is what ultimately guides our behavior, whether local rumors suggest that digging up dead relatives and burning their corpses will cure tuberculosis, or global, social-media-fueled rumors suggest that ingesting bleach will cure Covid-19. Because culture mediates all human experience, injecting mass misinformation into the communication streams of society poses a mortal threat to the health of the social body and the individuals within it.
The anthropologist Edward T. Hall famously asserted that “communication is culture.” When that communication is deliberately false, whether the subject is cigarettes, vaccines, climate change, or politics, we cannot possibly make informed decisions, as individuals or as a society. We can only make misinformed decisions, responding to the challenges of the world in ways no more helpful than digging up and decapitating a dead grandmother.
In 1990, Hall presciently observed that “Culture can be likened to a giant, extraordinarily complex, subtle computer. Its programs guide the actions and responses of human beings in every walk of life.” Thirty-odd years later, with a major portion of human communication having moved into the digital sphere, we have literalized his analogy. Our algorithm-driven culture has become a “giant, extraordinarily complex, subtle computer.” The triple communications revolution spawned by the breakneck spread of the internet, social media, and smartphones swept away the editorial bulwarks against misinformation that had evolved around earlier media. If we are to maintain a coherent consensus about reality and how best to meet it, new filters against misinformation are inevitable.
In much the same way that spam filters emerged to filter the junk out of email communications, tools must inevitably evolve to filter the junk out of internet discourse (including traditional media and its internet/social media/conversational echoes and distortions). Unfiltered mass misinformation corrupts the communication, and thereby the culture—the shared understandings—of society. And corruption is neither a neutral nor an insurmountable fact. Corruption originally referred to bodily decay, as in putrefactive decomposition, a coming apart of previously integrated cellular systems. Nobody would accept a gangrenous limb on their body, nor should we accept a misinformative corruption of the social body. This degradation of our underlying systems of epistemology—our ways of knowing, individually and together, about our reality—necessitates an immune response. Just as spam email would have rendered email communication worthless if it had not been effectively filtered, mass misinformation will render internet discourse worthless if it is not effectively filtered. The alternative is not just a dramatic devaluation of a revolutionary communication technology, but an acceleration of the decay and decomposition of our culture, which we have already begun to witness.
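To make the spam-filter analogy concrete, here is a minimal sketch of the Bayesian scoring that classic email filters use. The word probabilities are invented purely for illustration, and nothing here describes Public Editor’s own machinery:

```python
from math import log

# Toy likelihoods: P(word | spam) and P(word | legitimate mail). Real
# filters learn these from large labeled corpora; these are invented.
SPAM_LIKELIHOOD = {"winner": 0.80, "free": 0.60, "meeting": 0.05}
HAM_LIKELIHOOD = {"winner": 0.01, "free": 0.10, "meeting": 0.50}

def spam_log_odds(words, prior_spam=0.5):
    """Log-odds that a message is spam, given the words it contains."""
    score = log(prior_spam / (1.0 - prior_spam))
    for word in words:
        p_spam = SPAM_LIKELIHOOD.get(word, 0.2)  # smoothed default
        p_ham = HAM_LIKELIHOOD.get(word, 0.2)    # for unseen words
        score += log(p_spam / p_ham)
    return score

# A message is filtered once the evidence crosses a chosen threshold.
print(spam_log_odds(["winner", "free"]) > 0.0)  # True: junk folder
print(spam_log_odds(["meeting"]) > 0.0)         # False: inbox
```

The relevant feature of the analogy is the threshold: once the evidence of junk crosses a chosen line, the message is routed out of view rather than allowed to crowd out legitimate communication.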
Public outcry over the deleterious effects of social media on society went from a simmering discontent to a rolling boil of anger when The Wall Street Journal began publishing The Facebook Files in September 2021. Importantly, however, Facebook whistleblower Frances Haugen insists that social media technologies should—and can—be fixed, not eliminated. For that to happen, the corruption of communication that they facilitate must first be recognized for the foundational problem that it is. After all, no other social problem can be effectively addressed through a communication system that encourages misinformation.
Enter Public Editor, an inevitable evolution of communication technology that identifies and labels specific misinformation appearing in the online news articles which constitute the foundation of our public discourse. Public Editor is a collective intelligence system that labels over 40 types of misinformation in the most-shared news articles on the internet within 30 minutes of their publication, and the project has since earned funding from the FTX Foundation, among other sources. It is, in colloquial terms, a “people-powered bullshit detector” that empowers social media users to define their own threshold of tolerance for misinformation, as well as to decide their own level of engagement with efforts to correct misinformation for themselves and their fellow citizens. While some newsreaders may be content to simply set the tolerance filter, others may enjoy the opportunity to drill down, discover how these labels were generated, and ultimately participate in applying them to the misleading content within news articles.
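Public Editor has not published its internals here, but the tolerance filter it describes can be pictured as a simple rule applied to each labeled article. The data structure, label names, and threshold scale below are hypothetical, a sketch of the idea rather than the actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class LabeledArticle:
    """A news article annotated by Public Editor-style reviewers.

    The field names and label keys are hypothetical; Public Editor
    distinguishes over 40 misinformation types, two of which are
    sketched here.
    """
    url: str
    labels: dict = field(default_factory=dict)

def passes_filter(article: LabeledArticle, tolerance: int) -> bool:
    """Show an article only if its total count of labeled problems
    falls within the reader's self-chosen tolerance."""
    return sum(article.labels.values()) <= tolerance

story = LabeledArticle(
    url="https://example.com/story",  # hypothetical URL
    labels={"unsupported_claim": 3, "ad_hominem": 1},
)
print(passes_filter(story, tolerance=5))  # True: shown to this reader
print(passes_filter(story, tolerance=2))  # False: filtered out
```

The essential design choice is that the threshold belongs to the reader, not to a platform: two readers of the same labeled article can reasonably make different choices about what they are willing to see.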
In the fast-paced world of tech-enabled media, the process of labeling, filtering, and educating the public about misinformation must occur with similar immediacy. For newsreaders using Public Editor, corrective labels are accessible the very moment they are reading an article, not days later, and not as part of some other conversation in a new context. In the same way that a tool like Waze helps users better navigate their cities based on structured, real-time feedback from other drivers, Public Editor helps users better navigate the news cycle based on structured, real-time feedback from other readers.
Crucially, Public Editor is not some black-box AI generating unaccountable results. It warns newsreaders away from over 40 types of misinformation using a totally transparent, people-powered process grounded in centuries of continually improving scientific method. And like Wikipedia, it is a project that anyone can take part in, building their own discernment and ultimately the collective discernment of society. To this end, it is necessary not only to filter misinformation, but to empower people to spot it, to contend with it if and when they choose, to learn from it, and to raise their own standards of evidence. After all, the transition from the vampiric to the bacterial theory of tuberculosis required not only improvements in the scientific understanding of disease; it also required widespread public learning about that update to the disease model. Today, we must not only identify the actual mechanics of the disease of misinformation, but also enlighten public culture accordingly. A predictable downstream effect would be less misinformation in news media over time, as Public Editor introduces a layer of accountability that is already regarded as necessary and inevitable across the internet, from Uber to Airbnb to Amazon.
Every arena of human activity benefits from such social knowledge sharing. Since ancient times, we have relied on rumor networks, gossip, and word-of-mouth referrals. Today, those same social mechanisms for evaluating reputation have been carved in code as Rotten Tomatoes, Yelp, Angi, and the ability to access internet reviews across all dimensions of the economy. Public Editor advances this technological trend towards collectively evaluating products, companies, and services to a more granular level –– evaluating the specific claims within news articles. Thus, in the same way you probably would not spend 2 hours of your life watching a movie that received a 13% rating on Rotten Tomatoes, you probably would not spend 5 minutes of your life reading an article that received a 13% rating on Public Editor (unless you wanted to study examples of the many ways we humans fool each other through misleading and biased reasoning).
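To see how claim-level evaluations could roll up into a Rotten Tomatoes-style percentage, consider the following hypothetical scoring rule. This equal-weight scheme is an assumption made for illustration, not Public Editor’s published formula, which may weight its 40-plus label types differently:

```python
def article_score(claim_evaluations):
    """Percentage of a story's evaluated claims judged sound.

    claim_evaluations: list of booleans, one per evaluated claim
    (True = judged sound, False = judged misleading). Equal weighting
    is an assumption for illustration; a real scorer might weight
    different misinformation types differently.
    """
    if not claim_evaluations:
        return None  # nothing evaluated, nothing to score
    return round(100 * sum(claim_evaluations) / len(claim_evaluations))

# A story in which only 2 of 15 evaluated claims held up scores 13,
# the news analogue of a 13% Rotten Tomatoes rating.
print(article_score([True, True] + [False] * 13))  # 13
```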
It should not be surprising that the novel challenges introduced by technology necessitate technological solutions operating at the same scale. Government regulators and moral entrepreneurs alone can barely address—let alone solve—the problems foisted upon society by the explosive emergence of attention-optimizing social media technologies. Public Editor allows us to address these challenges immediately, without having to wait for the political process to sort them out. Remarkably, with a prosocial technology that meets the problem within the digital sphere itself, there is not even a need for the revisions to Section 230 of the Communications Decency Act that some have suggested. Indeed, the CDA already states that it is the policy of the US Government to support, and to remove barriers to, the use of third-party content labelers and filters (like Public Editor) that users of interactive computer services (like Facebook and Twitter) can access to improve their experience on these platforms:
S230(b) POLICY - It is the policy of the United States--
(1) to promote the continued development of the Internet and other interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies...
Ultimately, it is unreasonable to accept the corruption of communication as the price of a communication revolution; that bargain would amount to a communication devolution. Fortunately, not only does Public Editor represent the inevitable evolution of communication technology, it is already in hand. In a proof of concept, the Global Engagement Center of the US State Department, in cooperation with the EU’s East StratCom Task Force, ran a pilot of Public Editor to test its capacity to identify misinformation and disinformation (i.e., intentional misinformation) appearing within 250 news articles published by known propaganda sources. Public Editor highlighted the key problem identified by EUvsDisinfo experts in 94% of these articles, and identified equivalent errors in 72% of them. Public Editor also found additional mistakes not identified by EUvsDisinfo experts in 86% of the articles, and provided useful credibility information to newsreaders (as determined by reviewers) for 100% of the articles.
Every great technology is heralded for the problems it solves and then despised for those it creates. And then, just as inevitably for every surviving technology, it is improved upon. Early automobiles, for example, had plate glass windshields, resulting in people being impaled through the neck by shards of glass in low-speed collisions –– and the automobile industry even resisted the introduction of safety glass when it was first developed. But obviously, over time, automobile engineering has evolved to make cars much safer. We don’t abandon a useful technology; we learn how to minimize its problems and maximize its contributions.
That our system of government has grown increasingly polarized since the turn of the millennium is not coincidental to the aforementioned triple communications revolution of the internet, social media, and smartphones. There is no doubt we are facing a misinformation crisis –– a crisis of epistemology. And there is little doubt that political polarization and gridlock have been exacerbated by social media technologies optimized for attention and ad revenue and adopted on a massive scale without a full understanding of their social psychological effects. The accelerating breakdown of social structures we are now witnessing is an entirely predictable consequence of a communications system corrupted by misinformation. If we consider nearly three billion Facebook users as beta testers of communications revolution 1.0, surfacing its multifarious problems, Public Editor represents version 2.0, designed to disempower misinformation and empower truth. Fortunately, in the long view of history, this communications revolution has not ended. In fact, it has only just begun.