The Art of Deplatforming | Part III | Blurred Lines
III of IV | rules, interpretation, the line, Kafka, demonetization |
Every traffic light is a tombstone.
— Tarleton Gillespie, Custodians of the Internet, 2018.
The inconsistency is not a symptom, but a strategy.
I. Vanishing Infrastructure
Like intergalactic space, the farthest reaches of infrastructure are often hidden from view. Think of the postal service. One can see this postman or that postwoman, this or that post office, but one cannot fit the postal system within a single frame.ᶦ Infrastructure defies perception, whether optic or photographic. One can picture the structures (postman, postwoman, post van, post office), but the infrastructure (the postal system) forever remains a blur—a horizonal no man’s land of plausible deniability. When the post arrives in the letterbox, one never knows exactly how. When the post fails to arrive, one never knows exactly why.
As infrastructures of scale and speed that—with each passing day—are making the postal system seem like caveman technology, platforms have come to preside over vast swathes of grey area—social, political, legal and moral. Akin to the stealthy and liminal rise of Uber, which baked its business model into the cracks of the transport system,ᶦᶦ the MATAMA collective—Meta, Alphabet, Twitter, Amazon, Microsoft, Apple—has come to rely on the unphotographable aspect of infrastructure. Where’s the line? That’s for MATAMA to decide, undecide and redecide behind closed doors. To paraphrase Adam Serwer, the ambiguity is the point.ᶦᶦᶦ
II. Kingdoms of Words
On the front-end of things, when Twitter first published a list of “Rules” in 2009, the single webpage contained a mere 568 words.ᶦᵛ By 2016, the word count had doubled. Today, the several pages of “Rules and Policies” run to several thousand words. On the back-end of things, right up until Facebook first published a version of its “Community Standards” in November of 2009, the only literature that existed on content moderation was a one-page guidebook, which contained such rules as “Feel bad? Take it down.”ᵛ By December of 2009, following the hiring of lawyer Dave Willner, the one-page rulebook had grown into a 15,000-word document. Who knows the thickness of the rulebook today. There is no straight line between information and being informed, however. In fact, at the extremes, the two are inversely correlated. As criminologist and statistician Richard Berk has written: “Providing overwhelming amounts of information without adequate structure or documentation is not transparency.”ᵛᶦ Details and the devil go well together.
In legal theory, the distinction between standards (‘do not drive too fast’) and rules (‘do not drive faster than sixty-five miles per hour’) has become an ongoing source of confusion and controversy. In theory and practice, platforms are sowing the seeds of irresolution: not only between standards and rules, but between guidelines and policies, principles and practices, front-end PR and back-end OS. While Facebook states that its “standards of behaviour apply to everyone, no matter their status or fame,”ᵛᶦᶦ its flirtation with practices such as “whitelisting” states otherwise (more on that later). Whereas 2009–2015 Twitter declared itself “the free speech wing of the free speech party,” which—no matter the pressure to bend “these principles”—would not “actively monitor” or “censor” its users, 2015–2022 Twitter has corrected the record on a number of occasions.ᵛᶦᶦᶦ Interpretative flexibility is less an accident than an insurance policy.
III. Blurred Lines
While Facebook acknowledges that there is only “a fine line between false news and satire or opinion,”ᶦˣ Instagram (a Meta sibling) warns that any “overstepping” may kickstart the gears of deplatforming: “deleted content, disabled accounts, or other restrictions.”ˣ Under the guise of confounding users who will “seek to use the information to game the system,”ˣᶦ Twitter keeps the casualties of deplatforming in a Kafkaesque state of darkness, unaware of which rule was violated and, therefore, unaware of which violation to appeal. “There’s been little to no transparency on which Twitter accounts were taken down and why,” writes journalist Russell Brandom; “there’s no real justification for taking down the Red Scare podcast and not Ayatollah Khamenei.”ˣᶦᶦ At YouTube, meanwhile, due to the sheer volume of content, the line has become a key component of core infrastructure: “Our reviewers make decisions based on comprehensive enforcement guidelines and are very well calibrated on where the line is.”ˣᶦᶦᶦ
Creators are far less calibrated, however. For as media scholars David B. Nieborg and Thomas Poell have shown, Apple’s “App Store Review Guidelines” render the line anything but straightforward: “We will reject apps for any content or behavior that we believe is over the line. What line, you ask? Well, as a Supreme Court Justice once said, ‘I’ll know it when I see it.’ And we think that you will also know it when you cross it” [Italics added].ˣᶦᵛ For those struggling to keep their heads above water, that is simply too late.
Furthermore, the line can cross you. Though the line began life as a bulwark against illegal content, it was not long before linear rhetoric was being applied to “disputable claims of pirated content, nudity, or content transgressions that were simply against a platform’s ToS.”ˣᵛ Terms of service, of course, which never sit still.
IV. Too Big to Feel
For mainstream creators, the blurriness of the line poses little existential risk: firstly, because such “repairable lapses” tend to occur along the “slack” at the marginsˣᵛᶦ; and secondly, because the potential downsides of deplatforming—blacklisting, suspending, demonetizing, shadowbanning, banning, et cetera—are often weatherable storms, financially speaking. For marginal creators, however, who need to follow the rules so as to rise out of, or stay out of, creator poverty,ˣᵛᶦᶦ navigating the various ToS patchworks has become a four-dimensional tightrope: for the rules (and repercussions) of deplatforming ripple not only forwards and sideways, but backwards.ˣᵛᶦᶦᶦ
In February of 2017, when an article in The Wall Street Journal resurfaced an old clip of gamer Felix “PewDiePie” Kjellberg crossing the line,ˣᶦˣ the ripple effects of YouTube’s reaction were not limited to PewDiePie—a channel with 53 million subscribers in 2017 (now, 111 million)—but rather distributed across thousands (if not millions) of peripheral channels, for whom the algorithmic danceˣˣ had suddenly become a danse macabre.
In the wake of the PewDiePie scandal, a “fluctuation” during the “fine tuning” of AdSenseˣˣᶦ (the algorithm that connects YouTubers to advertisers) almost ended the career of political commentator David Pakman. Overnight, the advertising revenue for The David Pakman Show—a channel with 353,000 subscribers (now, 1.4 million)—dropped to as little as 6 cents per day, forcing Pakman to set up an emergency account on Patreon to cover the $20,000-per-month operating costs of the show and its team (another cast of characters, downstream of the headlines, whose livelihoods are often overlooked). “This is an existential threat to the show,” Pakman posted, amid the furore. “We need that money.” Despite having aired on public radio and been vetted by the FCC, the back-catalogue of The David Pakman Show had crossed the line: not any existing line, drawn by YouTube-the-community, but an expedient one, drawn by YouTube-the-corporation. One can only wonder how many Pakmans did not weather the storm.
V. Adhocracy
As loosely-defined ‘best practices’ concretize into ad hoc categories like “Rules and Policies” or “Community Guidelines,” platforms become part of the societal furniture: seamless and seen less. Ambiguity functions as a form of corporate camouflage, which surrounds the infrastructural core in wet cement: always drying, but never dry.
Over time, the strategic narrowing and widening of the line has become an art form—a dark art—that has granted platforms the ability to keep their double-, triple-, n-tuple dealings at arm’s length (yet within reach). Through the rhetorical extension of ‘standards’ and ‘principles’ beyond what ‘rules’ and ‘practices’ will support, platforms feed not only false hope, but false rope to marginal creators: a soft form of stochastic terrorism,ˣˣᶦᶦ which simultaneously (dis)incentivizes the risks of creativity. The infrastructural edge becomes a retractable ledge, which can be pulled back at the slightest whisper of public or market pressure. To paraphrase Rolfe Winkler: the margins take all the risk, the mainstream takes all the reward.ˣˣᶦᶦᶦ
For platforms, who would prefer neither the margins nor the masses come to “understand how either law or sausages comes to be made,”ˣˣᶦᵛ deplatforming has become a meal best served silently. In a leaked speech from 2015, however, Dick Costolo, the CEO of Twitter, said the quiet part out loud: “We’re going to start kicking these people off right and left and making sure that when they issue their ridiculous attacks, nobody hears them” [Italics added].ˣˣᵛ
References
ᶦ Lisa Parks, “‘Stuff You Can Kick’: Toward a Theory of Media Infrastructures,” in Between Humanities and the Digital, eds. Patrik Svensson and David Theo Goldberg (Cambridge: MIT Press, 2015), 359.
ᶦᶦ Raghu Garud, Arun Kumaraswamy, Anna Roberts & Le Xu, “Liminal Movement by Digital Platform-Based Sharing Economy Ventures: The Case of Uber Technologies,” Strategic Management Journal, Vol. 42, No. 3 (2022).
ᶦᶦᶦ Adam Serwer, The Cruelty Is the Point: The Past, Present, and Future of Trump's America (London: One World, 2021).
ᶦᵛ Sarah Jeong, “The History of Twitter's Rules,” Vice, 14 January 2016. https://www.vice.com/en/article/z43xw3/the-history-of-twitters-rules.
ᵛ Kate Klonick, “The New Governors: The People, Rules, and Processes Governing Online Speech,” Harvard Law Review, Vol. 131 (2017), 1631.
ᵛᶦ Richard Berk: quoted in Brian Christian, The Alignment Problem (New York: W. W. Norton & Company, 2020), 82.
ᵛᶦᶦ Jeff Horwitz, “Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt,” The Wall Street Journal, 13 September 2021. https://www.wsj.com/amp/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353.
ᵛᶦᶦᶦ Jeong, “The History of Twitter's Rules.”
ᶦˣ Facebook, “Community Standards,” Facebook Transparency Centre. https://transparency.fb.com/en-gb/policies/community-standards/ (2021).
ˣ Instagram: quoted in Jessica Maddox & Jennifer Malson, “Guidelines Without Lines, Communities Without Borders: The Marketplace of Ideas and Digital Manifest Destiny in Social Media Platform Policies,” Social Media + Society, Vol. 6, No. 2 (2020), 6.
ˣᶦ Dieter Bohn, “One of Twitter’s New Anti-Abuse Measures Is the Oldest Trick in the Forum Moderation Book,” The Verge, 16 February 2017.
ˣᶦᶦ Russell Brandom, “Why Platforms Had to Cut Off Trump and Parler,” The Verge, 11 January 2021. https://www.theverge.com/22224860/parler-trump-deplatformed-capitol-raid-moderation-censorship-facebook-amazon-twitter.
ˣᶦᶦᶦ YouTube, “Rules and Policies,” How YouTube Works. https://www.youtube.com/howyoutubeworks/policies/.
ˣᶦᵛ Apple: quoted in David B. Nieborg & Thomas Poell, “The Platformization of Cultural Production: Theorizing the Contingent Cultural Commodity,” New Media & Society, Vol. 20, No. 11 (2018), 4287.
ˣᵛ José van Dijck, Tim de Winkel & Mirko Tobias Schäfer, “Deplatformization and the Governance of the Platform Ecosystem,” New Media & Society (2021), 3.
ˣᵛᶦ Albert O. Hirschman, Exit, Voice, and Loyalty (Cambridge: Harvard University Press, 1970), 1.
ˣᵛᶦᶦ Xiaoren Wang, “YouTube Creativity and the Regulator’s Dilemma: An Assessment of Factors Shaping Creative Production on Video-Sharing Platforms,” Albany Law Journal of Science and Technology, Vol. 32, No. 3 (Forthcoming).
ˣᵛᶦᶦᶦ Robyn Caplan & Tarleton Gillespie, “Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy,” Social Media + Society, Vol. 6, No. 2 (2020), 7.
ˣᶦˣ Julia Alexander, “The Golden Age of YouTube Is Over,” The Verge, 5 April 2019. https://www.theverge.com/2019/4/5/18287318/youtube-logan-paul-pewdiepie-demonetization-adpocalypse-premium-influencers-creators.
ˣˣ Sangeet Kumar, “Algorithmic Dance: YouTube’s Adpocalypse and the Gatekeeping of Cultural Content on Digital Platforms,” Internet Policy Review, Vol. 8, No. 2 (2019).
ˣˣᶦ Amanda Hess, “How YouTube’s Shifting Algorithms Hurt Independent Media,” The New York Times, 17 April 2017. https://www.nytimes.com/2017/04/17/arts/youtube-broadcasters-algorithm-ads.html.
ˣˣᶦᶦ Molly Amman & J. Reid Meloy, “Stochastic Terrorism: A Linguistic and Psychological Analysis,” Perspectives on Terrorism, Vol. 15, No. 5 (2021).
ˣˣᶦᶦᶦ Rolfe Winkler: quoted in Nassim Nicholas Taleb & Charles S. Tapiero, “Risk Externalities and Too Big to Fail,” Physica A: Statistical Mechanics and its Applications, Vol. 389, No. 17 (2010), 3503.
ˣˣᶦᵛ Susan Leigh Star & Geoffrey C. Bowker, “Enacting Silence: Residual Categories as a Challenge for Ethics, Information Systems, and Communication,” Ethics and Information Technology, Vol. 9, No. 4 (2007).
ˣˣᵛ Dick Costolo: quoted in Tarleton Gillespie, Custodians of the Internet (New Haven: Yale University Press, 2018), 25.